Vitual Sommelier : Final Project

Introduction to Data Science (MATH 4100/ COMP 5360), University of Utah.

Github Repository

Collaboration between the team was done on git: https://github.com/ArithmeticR/COMP_5360_Project

Team Members:

  • Brian Tillman
  • Li Jiada
  • Trevor Olsen

Background and Motivation

Brian has a background in User Experience Design. He is interested in using data to influnece interaction design.
Li has a background in hydroinformatics, especially the big data in water resource management. He is interested in data mining and data visualization
Trevor has a background in marketing research. He is interested in developing models for pricing products and identifying attributes to use in marketing.

Project Objectives

The primary objective is to use natural language processing on wine reviews to classify the category, country, and taster by machine learning. The secondary objective is to build an artificial neural network that classifies all three at once, then examine the output of the network ran backwards. The purpose being to discover insights into the different combinations of category, country, and taster.

What the team wished to learn and accomplish?

Brian - Learn more about how to apply data analysis to solve marketing issues. Learn more about NLP and processing emotion in writing.
Jiada - Learn more about how to visualize big data to dig out the valuable information which is hidden Benefits: With the data mining we can help people to change consumption behavior and save water
Trevor - Learn more about NLP, and about ensemble methods.

Data Processing

The data for the project was collected from https://www.winemag.com. The site contains 215,395 reviews of wines that include the price. The https://www.winemag.com/robots.txt site specifies the crawl-delay to be 5. Requiring around 300 hours (12.5 days) to scrape all the reviews from a single computer.

These are the fields extracted from each review:


Url - Where the review is hosted.
Title - Name of the review.
Points - Number of points given in the review. On a 100 point scale.
Description - The taster's review of the wine.
Price - Price of the wine. We think it's all in dollars.
Variety - Type of wine.
Appellation - a name or title of the wine.
Winery - Location the wine was produced.
Alcohol - Percent of achohol.
Bottle Size - Size of the wine bottle. It might all be in millilitres.
Category - Category of the wine.
Importer - Name of the importer.
Date Published - Date the review was published.
User Avg Rating - Rating of the review.
Taster - Name of the reviewer.

Libraries for scraping

In [ ]:
############################################
###
### Don't Run only for presentation
### 
############################################

import pandas as pd
import scipy as sc
import numpy as np
from bs4 import BeautifulSoup
import requests
import urllib.request
import pickle
import glob
import re
import time
import statsmodels.formula.api as sm

Get individual review urls

Extracting the data was very similar to the web scraping homework for the github repositories. The first step was scapping the url for the individual pages from the site filtering on price. This ensured that only reviews with price were pulled.

One thing to point out about the code (see * below) is the way the crawl delay was implemented. The process slept for 5 seconds minus the number of seconds transpired since the individual url request began.

In [ ]:
############################################
###
### Don't Run only for presentation
### 
############################################

session = requests.Session()
HEADERS = {
    'user-agent': ('Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 '
                   '(KHTML, like Gecko) Chrome/48.0.2564.109 Safari/537.36')
}
first_page = 1
last_page = 7180
results_url = "https://www.winemag.com/?s=&drink_type=wine&price=1.0-15.99,16.0-25.99,100.0-199.99,76.0-99.99,61.0-75.99,41.0-60.99,26.0-40.99,200.0-*&page="
raw_pages = []
for i in range(first_page, last_page + 1 ):
    time_from_request = time.time() # (*)
    url = results_url + str(i)
    print(i)
    response = session.get(url, headers=HEADERS)
    my_page = BeautifulSoup(response.content, 'html.parser')
    raw_review_urls = [ review.get("href") for review in my_page.select(".review-item a")]
    clean_review_urls = [ my_url for my_url in raw_review_urls  if   bool(re.search(r'^https://www.winemag.com/buying-guide/', my_url))]
    pickle.dump( clean_review_urls, open( "urls/raw_pages"+str(i)+".p", "wb" ) )
    if time.time() - time_from_request < 5: # (*)
        time.sleep(5.01 - (time.time() - time_from_request))

Not shown in the code above is the step where all the pickled files are concatenated together to form a master list of the individual review urls.

Scape individual reviews

Each review is written out to disk once it is scraped. These are some of the benefits from using this approach:

  • Each url request has to take 5 seconds, so might as well utilize some of that time doing something other than sleeping.
  • Only can lose one review at a time when the process crashes.
  • If the process does crash, then it can quickly find where it left off.

This is the function used to write each review out to disk. Notice that it wasn't exactly clear what fields would be contained in the primary and secondary blocks. The function concatenates the field name and values with "||||". The assumption is that "||||" is unique enough not to appear in the field name.

In [ ]:
############################################
###
### Don't Run only for presentation
### 
############################################

def write_my_file(i, raw_review_pages):
    file = open("reviews/url_" + str(i) + ".txt", "w")
    x = raw_review_pages
    url_i = x[0]
    title = x[1]
    points = x[2]
    description = x[3]
    taster = x[8]
    primary_info_label = x[4]
    primary_info = x[5]
    secondary_info_label = x[6]
    secondary_info = x[7]
    file.write(str(url_i).replace('\n', '').replace('\t', ''))
    file.write("\t")
    file.write(str(title).replace('\n', '').replace('\t', ''))
    file.write("\t")
    file.write(str(points).replace('\n', '').replace('\t', ''))
    file.write("\t")
    file.write(str(description).replace('\n', '').replace('\t', ''))
    file.write("\t")
    file.write(str(taster).replace('\n', '').replace('\t', ''))
    file.write("\t")
    for y, z in zip(primary_info_label, primary_info):
        file.write(str(y).replace('\n', '').replace('\t', '').replace('<span>', '').replace('</span>', ''))
        file.write("||||")
        z = str(z).replace('\n', '').replace('\t', '')
        z = re.sub(r"<.+?>", "", z)
        file.write(z)
        file.write("\t")
    for y, z in zip(secondary_info_label, secondary_info):
        y = str(y).replace('\n', '').replace('\t', '').replace('<span>', '').replace('</span>', '')
        if y != "User Avg Rating":
            file.write(str(y).replace('\n', '').replace('\t', '').replace('<span>', '').replace('</span>', ''))
            file.write("||||")
            z = str(z).replace('\n', '').replace('\t', '')
            z = re.sub(r"<.+?>", "", z)
            file.write(z)
            file.write("\t")
    file.write("\n")
    file.close()  

This code uses the file 'not_picked.csv'. This file contains all the urls that had not been scrapped by the beginning of the process. The code to create this file is not shown, but this is done by preforming a set difference on the master url list with the urls that have already been pulled. This was done twice to ensure all the urls were indeed pulled. Some of the page requests failed the first time they were tried, and others never worked.

The code checks at each iteration whether the file with the 'page_i_in_loop' index already exists. The cost in inefficiency was offset by the ease of restarting the process upon failure.

In [ ]:
############################################
###
### Don't Run only for presentation
### 
############################################

df = pd.read_csv('not_picked.csv')
all_urls = df.x.values.tolist()
 
session = requests.Session()
HEADERS = {
    'user-agent': ('Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 '
                   '(KHTML, like Gecko) Chrome/48.0.2564.109 Safari/537.36')
}
first_page = 1
last_page = 215395
for page_i_in_loop in range(first_page, last_page + 1):
    time_from_request = time.time()
    my_files = glob.glob("reviews/*.txt")
    ## check string twice because the differences in windows and linux machines
    if "reviews/url_" + str(page_i_in_loop) + ".txt" not in my_files and "reviews\\url_" + str(page_i_in_loop) + ".txt" not in my_files :
        url_i = all_urls[page_i_in_loop]
        try:
            response = session.get(url_i, headers=HEADERS)
            soup_review_page = BeautifulSoup(response.content, 'html.parser')
            structure_reviews = []
            try:
                title = soup_review_page.select(".heading-area .article-title")[0].text
            except:
                title = None
            try:
                points = soup_review_page.select(".rating #points")[0].text
            except:
                points = None
            try:
                description = soup_review_page.select(".description")[0].text
            except:
                description = None
            try:
                primary_info_label = soup_review_page.select(".primary-info .row .info-label span")
            except:
                primary_info_label = None
            try:
                primary_info = soup_review_page.select(".primary-info .row .info")
            except:
                primary_info = None
            try:
                secondary_info = soup_review_page.select(".secondary-info .row .info")
            except:
                secondary_info = None
            try:
                secondary_info_label = soup_review_page.select(".secondary-info .row .info-label span")
            except:
                secondary_info_label = None
            try:
                taster = soup_review_page.select(".taster .name")[0].text
            except:
                taster = None
            print(page_i_in_loop)
            structure_reviews=[url_i, title, points, description, primary_info_label,
                                      primary_info, secondary_info_label,
                                      secondary_info, taster]
            print(page_i_in_loop)
            write_my_file(page_i_in_loop, structure_reviews)
        except Exception as e:
            print(str(e))
        if time.time() - time_from_request < 5:
            time.sleep(5.01 - (time.time() - time_from_request))

Side Notes

Truth be told, collecting the data was more of a pain than first thought. First we tried pickling the reviews in batches, but the files sizes were enormous. Still not sure exactly why. Maybe it was because of the string encoding or the nested structure we were using. Some time was wasted before switching to text files for storage (which greatly reduced the size). This was a bit of a time consuming process and we needed 13 days to pull the data on a single computer.

So, what ended up happening was running 10 linux servers (The max number allow by default at digitalocean.com) and collecting the reviews once the process was refined. This code above is very similar to the code ran on these machines. The github repo contains 24 branches for this reason.

R code - Consolidate Reviews

The right encoding was difficult to find in python (encoding = "ISO-8859-1"). For this reason, R was initially used to consolidate the reviews. For whatever reason, R was able to read the files in without any encoding issues. Some care had to be taken to handle the fields from the primary and secondary blocks.

In [ ]:
######################################
###
#### R Code
###
#######################################

############################################
###
### Don't Run only for presentation
### 
############################################
all_files <- list.files(path = "reviews", pattern = ".txt", 
             all.files = FALSE,
             full.names = T, recursive = FALSE,
             ignore.case = FALSE, include.dirs = FALSE, no.. = FALSE)

my_input <- matrix(NA,nrow=300000 ,ncol=20)
myinput_list <-  vector("list", length = 300000)
i = 1
for(my_file in all_files){
  con = file(my_file, "r")
  while ( TRUE ) {
    line = readLines(con, n = 1)
    if ( length(line) == 0 ) {
      break
    }
    myinput_list[i] <-strsplit(line,"\t")
    i = i + 1
  }
  close(con)
}
myinput_list2 <- purrr::compact(myinput_list)
myinput_list3 <- myinput_list2
my_matrix <- matrix(NA, nrow=length(myinput_list3), ncol=15)
for(my_name in seq_along(myinput_list3)){
  my_length <- length(myinput_list3[[my_name]])
  if(my_length>12){
  my_matrix[my_name,1:my_length] <- myinput_list3[[my_name]]
  }
}
my_df <- as.data.frame(my_matrix,stringsAsFactors=FALSE)
my_df2 <- my_df[!duplicated(my_df$V1),]
colnames(my_df2) <- c('url','title','points','description','taster',paste0('V',6:15))
table(my_df2$taster)
my_df2[1,] 
save(my_df2,file="unstructured_df.rdata")

R code - Basic Cleaning

Some basic cleaning and transformations were done in R, since the encoding still had not been figured out for python.

The code for handling the primary and secondary blocks (code between the (*) comments) is very inefficient, but was offset by the ease of coding.

The files were chunked into 20,000 review blocks. This seemed to be the right number to keep the chunks under the 25 mb limit for pushing to github.

In [ ]:
######################################
###
#### R Code
###
#######################################

############################################
###
### Don't Run only for presentation
### 
############################################
load(file="unstructured_df.rdata")

# (*) handling the primary and secondary blocks
for(i in 1:nrow(my_df2)){ 
  for(j in 6:15){
    my_split <- strsplit(my_df2[i,j],"||||",fixed=T)[[1]]
    if(!make.names(my_split[1]) %in% colnames(my_df2)){
      my_df2[[make.names(my_split[1]) ]] <- NA
    }
    my_df2[i,make.names(my_split[1]) ] <- my_split[2]
  }
}
# (*)

my_df3  <- my_df2[, -c(6:15)]
my_df3  <- my_df3[, -c(15)] 


my_df3$Price2 <- as.numeric(gsub("$","",gsub(",  Buy Now","",my_df3$Price), fixed = T))
my_df3$Alcohol2 <- as.numeric(gsub("%","",my_df3$Alcohol))/100
my_df3$Bottle.Size2 <- gsub("ml","",gsub(" ","",tolower(my_df3$Bottle.Size), fixed = T))
my_df3$milliliters <-ifelse(grepl("l",my_df3$Bottle.Size2), 
                            as.numeric(gsub("l","",my_df3$Bottle.Size2)) * 1000, 
                            as.numeric(my_df3$Bottle.Size2))

my_df3$price_per_liter <- 1000*my_df3$Price2/   my_df3$milliliters

Appellations <- strsplit(my_df3$Appellation,",")
my_df3$l1<- NA
my_df3$l2<- NA
my_df3$l3<- NA
my_df3$l4<- NA
my_df3$l5<- NA
i=2
for(i in 1:nrow(my_df3)){
  my_df3[i,paste0('l',(5-length(Appellations[[i]])+1):5)] <- trimws(Appellations[[i]])
}
names_vector <- c("url" , "Date.Published","title", "taster"  ,
                  "Alcohol" , "Alcohol2",
                  "Bottle.Size","Bottle.Size2", "milliliters" ,
                  "points","Price", "Price2"  ,"price_per_liter",
                    "Importer"     , "Winery" ,
                  "Appellation"   ,  "l1" ,  "l2" ,"l3" ,   "l4", "l5"
                    ,  "Designation","Category" , "Variety" , "description")
my_df3 <- my_df3[,names_vector]
for(i in 1:10){
  write.table(my_df3[((i-1)*20000+1):((i)*20000),], 
  file=paste0("structured_df_",i,".txt"), sep="\t", row.names = F)
}
write.table(my_df3[200001:214103,],file=paste0("structured_df_",11,".txt"), sep="\t", row.names = F)
save(my_df3,file="structured_df.rdata")
write.table(my_df3,file="structured_df.txt", sep="\t", row.names = F)

Libraries for the rest of the code

must rerun for interactive chart!

In [1]:
%%javascript
require.config({
  paths: {
    highcharts: "http://code.highcharts.com/highcharts",
    highcharts_exports: "http://code.highcharts.com/modules/exporting",
  },
  shim: {
    highcharts: {
      exports: "Highcharts",
      deps: ["jquery"]
    },
    highcharts_exports: {
      exports: "Highcharts",
      deps: ["highcharts"]
    }
  }
});
In [2]:
from PIL import Image
from wordcloud import WordCloud
import math
import numpy as np
import scipy as sc
import pandas as pd

import pylab 
import scipy.stats as stats
import statsmodels.api as sm
from sklearn.svm import SVC
from sklearn.svm import LinearSVC
from sklearn import linear_model
from sklearn.utils import shuffle

from sklearn.neighbors import NearestNeighbors
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neighbors import KNeighborsRegressor
from sklearn import svm
from sklearn.model_selection import cross_val_score
from sklearn.model_selection import train_test_split
import seaborn as sns
import matplotlib.pyplot as plt
%matplotlib inline
plt.rcParams['figure.figsize'] = (15, 9)
plt.style.use('ggplot')

from sklearn.externals import joblib
from sklearn.preprocessing import StandardScaler

from sklearn.preprocessing import scale
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.ensemble import RandomForestClassifier
from sklearn.ensemble import RandomForestRegressor
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.ensemble import ExtraTreesRegressor
from sklearn.ensemble import BaggingClassifier
from sklearn.ensemble import BaggingRegressor
from sklearn.ensemble import AdaBoostClassifier
from sklearn.ensemble import AdaBoostRegressor


from sklearn.ensemble import VotingClassifier

from sklearn.neural_network import MLPClassifier
from sklearn.neural_network import MLPRegressor

from sklearn.preprocessing import Normalizer

from sklearn.decomposition import TruncatedSVD
from sklearn.naive_bayes import MultinomialNB
from sklearn import metrics

from nltk.corpus import stopwords
from sklearn.feature_extraction.text import TfidfVectorizer

from sklearn.linear_model import ElasticNet

from sklearn import metrics
from sklearn.tree import DecisionTreeClassifier
from sklearn.tree import DecisionTreeRegressor
/anaconda3/lib/python3.6/site-packages/statsmodels/compat/pandas.py:56: FutureWarning: The pandas.core.datetools module is deprecated and will be removed in a future version. Please use the pandas.tseries module instead.
  from pandas.core import datetools

Getting Data into Python

Read in data

The right encoding turned out to be "ISO-8859-1". The files had been chunked into file sizes below 25 mbs. This allowed them to be stored on github.

In [3]:
df = pd.read_csv('structured_df_1.txt',sep='\t', encoding = "ISO-8859-1")
for i in range(1,12):
    df = df.append(pd.read_csv('structured_df_'+str(i)+'.txt',sep='\t', encoding = "ISO-8859-1"))
df = df.drop_duplicates()
df.shape
Out[3]:
(214103, 25)
In [4]:
df2 = df[df.description.notnull()]
df2 = df2[df2.points.notnull()]
df2 = df2[df2.price_per_liter.notnull()]
df2.shape
Out[4]:
(209895, 25)
In [5]:
df2[['price_per_liter_clip']]= df2[['price_per_liter']].clip(0,100)

Exploratory Analysis

In [5]:
df2.dtypes
Out[5]:
url                      object
Date.Published           object
title                    object
taster                   object
Alcohol                  object
Alcohol2                float64
Bottle.Size              object
Bottle.Size2             object
milliliters             float64
points                  float64
Price                    object
Price2                  float64
price_per_liter         float64
Importer                 object
Winery                   object
Appellation              object
l1                       object
l2                       object
l3                       object
l4                       object
l5                       object
Designation              object
Category                 object
Variety                  object
description              object
price_per_liter_clip    float64
dtype: object
In [6]:
df2.describe()
Out[6]:
Alcohol2 milliliters points Price2 price_per_liter price_per_liter_clip
count 178749.000000 209895.000000 209895.000000 209895.000000 209895.000000 209895.000000
mean 0.137098 748.129827 88.072612 33.859905 45.854517 41.009230
std 0.226622 76.117666 3.155364 41.619085 58.627023 25.460287
min 0.015000 187.000000 22.000000 4.000000 5.000000 5.000000
25% 0.130000 750.000000 86.000000 16.000000 21.333333 21.333333
50% 0.135000 750.000000 88.000000 25.000000 33.333333 33.333333
75% 0.144000 750.000000 90.000000 40.000000 53.333333 53.333333
max 83.330000 3000.000000 100.000000 6000.000000 8000.000000 100.000000
In [7]:
g = sns.PairGrid(df2[['milliliters','points','price_per_liter','Category']], hue="Category")
g = g.map_diag(plt.hist)
g = g.map_offdiag(plt.scatter)
g = g.add_legend()

There are a lot of missing values for alcohol. Plus some values coded on the wrong scale.

In [8]:
df2.taster.value_counts()
Out[8]:
None                  63705
Roger Voss            30078
Michael Schachner     28296
Paul Gregutt          17770
Joe Czerwinski        12903
Kerin O’Keefe       11682
Virginie Boone        11223
Matt Kettmann          7189
Anna Lee C. Iijima     5529
Sean P. Sullivan       5478
Jim Gordon             4700
Anne Krebiehl MW      3824
Lauren Buzzeo          2624
Susan Kostrzewa        2455
Alexander Peartree      604
Mike DeSimone           597
Jeff Jenssen            540
Christina Pickard       340
Carrie Dykes            169
Sarah E. Daniels        134
Fiona Adams              55
Name: taster, dtype: int64

30% of the reviews do not have an offical taster. They are labeled 'None'.

In [9]:
df2.Category.value_counts()
Out[9]:
Red            127823
White           62802
Sparkling        9033
Rose             5910
Dessert          2837
Port/Sherry      1380
Fortified         110
Name: Category, dtype: int64

60% of the reviews are red, and 91% are either red or white.

In [10]:
df2.l5.value_counts()
Out[10]:
US                        94567
France                    28196
Italy                     27141
Spain                     11775
Chile                      8094
Portugal                   7393
Australia                  7281
Argentina                  7104
Austria                    4276
Germany                    3856
South Africa               3251
New Zealand                3203
Greece                      875
Israel                      771
Canada                      342
Hungary                     239
Bulgaria                    189
Uruguay                     186
Romania                     166
Croatia                     109
Turkey                      106
Georgia                     106
Mexico                       96
Moldova                      92
Slovenia                     91
Brazil                       74
England                      67
Lebanon                      52
Morocco                      28
Peru                         20
Cyprus                       20
Ukraine                      19
Macedonia                    18
Czech Republic               15
Switzerland                  12
Serbia                       11
India                         9
Luxembourg                    8
China                         8
Kosovo                        6
Armenia                       5
Lithuania                     4
Bosnia and Herzegovina        4
South Korea                   3
Japan                         2
Slovakia                      2
Montenegro                    1
US-France                     1
Albania                       1
Name: l5, dtype: int64
In [11]:
pd.crosstab(df2.taster,df2.Category)
Out[11]:
Category Dessert Fortified Port/Sherry Red Rose Sparkling White
taster
Alexander Peartree 7 4 4 312 27 13 237
Anna Lee C. Iijima 192 0 0 1342 200 188 3607
Anne Krebiehl MW 97 0 0 588 59 354 2726
Carrie Dykes 5 1 1 96 6 0 60
Christina Pickard 0 0 0 190 0 8 142
Fiona Adams 0 0 0 36 0 8 11
Jeff Jenssen 34 0 0 235 18 15 238
Jim Gordon 46 1 7 3112 147 126 1261
Joe Czerwinski 242 14 157 6282 438 408 5362
Kerin O’Keefe 127 7 0 8166 135 947 2300
Lauren Buzzeo 29 25 0 1381 246 44 899
Matt Kettmann 38 0 8 4804 236 79 2024
Michael Schachner 152 2 253 19591 662 1254 6382
Mike DeSimone 2 0 0 422 23 4 146
None 985 7 98 41773 975 2459 17408
Paul Gregutt 89 1 6 12089 288 297 5000
Roger Voss 551 45 820 14358 1933 2576 9795
Sarah E. Daniels 0 0 0 57 0 3 74
Sean P. Sullivan 78 2 2 3975 150 29 1242
Susan Kostrzewa 96 0 4 1207 94 37 1017
Virginie Boone 67 1 20 7807 273 184 2871
In [12]:
pd.crosstab(df2.l5,df2.taster)
Out[12]:
taster Alexander Peartree Anna Lee C. Iijima Anne Krebiehl MW Carrie Dykes Christina Pickard Fiona Adams Jeff Jenssen Jim Gordon Joe Czerwinski Kerin O’Keefe ... Matt Kettmann Michael Schachner Mike DeSimone None Paul Gregutt Roger Voss Sarah E. Daniels Sean P. Sullivan Susan Kostrzewa Virginie Boone
l5
Albania 0 0 0 0 0 0 1 0 0 0 ... 0 0 0 0 0 0 0 0 0 0
Argentina 0 0 0 0 0 0 0 0 5 0 ... 0 7026 0 71 2 0 0 0 0 0
Armenia 0 0 0 0 0 0 0 0 0 0 ... 0 0 5 0 0 0 0 0 0 0
Australia 0 0 0 0 185 0 0 0 3930 0 ... 0 118 0 3041 2 3 2 0 0 0
Austria 0 0 2259 0 0 0 0 0 268 0 ... 0 4 0 24 0 1720 1 0 0 0
Bosnia and Herzegovina 0 2 0 0 0 0 2 0 0 0 ... 0 0 0 0 0 0 0 0 0 0
Brazil 0 0 0 0 0 0 0 0 0 0 ... 0 71 0 3 0 0 0 0 0 0
Bulgaria 0 25 0 0 0 0 136 0 8 0 ... 0 0 1 3 0 0 0 0 16 0
Canada 0 24 0 0 0 0 0 0 23 0 ... 0 0 0 16 201 0 0 34 42 0
Chile 0 0 0 0 0 0 0 0 51 0 ... 1 7652 0 382 4 3 1 0 0 0
China 0 0 0 0 0 0 0 0 0 0 ... 0 0 4 1 0 0 0 0 3 0
Croatia 0 57 0 0 0 0 47 0 0 0 ... 0 0 0 3 0 0 0 0 2 0
Cyprus 0 0 0 0 0 0 0 0 0 0 ... 0 0 0 3 0 0 1 0 16 0
Czech Republic 0 7 0 0 0 0 8 0 0 0 ... 0 0 0 0 0 0 0 0 0 0
England 0 0 67 0 0 0 0 0 0 0 ... 0 0 0 0 0 0 0 0 0 0
France 0 120 1498 0 0 0 0 0 2816 0 ... 0 362 0 1249 207 20995 4 0 2 0
Georgia 0 25 0 0 0 0 0 0 0 0 ... 0 0 72 3 0 0 0 0 6 0
Germany 0 2187 0 0 0 0 0 0 1416 0 ... 0 0 0 141 15 95 2 0 0 0
Greece 0 0 0 0 0 0 1 0 82 0 ... 0 1 0 87 0 0 9 0 695 0
Hungary 0 47 0 0 0 0 108 0 1 0 ... 0 0 0 32 0 0 0 0 51 0
India 0 1 0 0 0 0 0 0 0 0 ... 0 0 8 0 0 0 0 0 0 0
Israel 0 0 0 0 0 0 0 0 51 0 ... 0 1 367 25 0 0 9 0 17 0
Italy 81 0 0 0 0 0 0 0 422 11682 ... 0 1153 0 13575 8 170 50 0 0 0
Japan 0 0 0 0 0 0 0 0 0 0 ... 0 0 0 0 0 0 0 0 2 0
Kosovo 0 0 0 0 0 0 6 0 0 0 ... 0 0 0 0 0 0 0 0 0 0
Lebanon 0 18 0 0 0 0 0 0 1 0 ... 0 0 19 2 0 0 2 0 10 0
Lithuania 0 0 0 0 0 0 0 0 0 0 ... 0 0 0 0 0 0 0 0 4 0
Luxembourg 0 0 0 0 0 0 5 0 0 0 ... 0 0 0 0 0 0 0 0 3 0
Macedonia 0 1 0 0 0 0 7 0 0 0 ... 0 0 0 4 0 0 0 0 6 0
Mexico 0 0 0 0 0 0 0 0 0 0 ... 0 90 0 6 0 0 0 0 0 0
Moldova 0 1 0 0 0 0 61 0 0 0 ... 0 0 0 15 0 0 0 0 15 0
Montenegro 0 1 0 0 0 0 0 0 0 0 ... 0 0 0 0 0 0 0 0 0 0
Morocco 0 1 0 0 0 0 0 0 0 0 ... 0 0 25 1 0 1 0 0 0 0
New Zealand 0 0 0 0 155 0 0 0 2852 0 ... 0 17 0 179 0 0 0 0 0 0
Peru 0 0 0 0 0 0 0 0 0 0 ... 0 20 0 0 0 0 0 0 0 0
Portugal 0 0 0 0 0 0 0 0 303 0 ... 0 52 0 68 0 6966 2 0 1 0
Romania 0 58 0 0 0 0 64 0 5 0 ... 0 0 0 11 0 0 0 0 28 0
Serbia 0 0 0 0 0 0 11 0 0 0 ... 0 0 0 0 0 0 0 0 0 0
Slovakia 0 1 0 0 0 0 0 0 0 0 ... 0 0 0 0 0 0 0 0 1 0
Slovenia 0 31 0 0 0 0 59 0 0 0 ... 0 0 0 1 0 0 0 0 0 0
South Africa 0 0 0 0 0 0 0 0 30 0 ... 0 26 0 839 0 102 1 0 910 0
South Korea 0 0 0 0 0 0 0 0 0 0 ... 0 0 0 0 0 0 0 0 3 0
Spain 0 0 0 0 0 0 0 0 97 0 ... 0 11211 0 438 12 11 4 0 1 0
Switzerland 0 0 0 0 0 0 10 0 0 0 ... 0 0 0 0 0 0 0 0 2 0
Turkey 0 10 0 0 0 0 0 0 0 0 ... 0 0 96 0 0 0 0 0 0 0
US 523 2907 0 169 0 55 0 4700 542 0 ... 7188 310 0 43477 17319 12 46 5444 619 11223
US-France 0 0 0 0 0 0 0 0 0 0 ... 0 0 0 1 0 0 0 0 0 0
Ukraine 0 5 0 0 0 0 14 0 0 0 ... 0 0 0 0 0 0 0 0 0 0
Uruguay 0 0 0 0 0 0 0 0 0 0 ... 0 182 0 4 0 0 0 0 0 0

49 rows × 21 columns

In [13]:
pd.crosstab(df2.l5,df2.Category)
Out[13]:
Category Dessert Fortified Port/Sherry Red Rose Sparkling White
l5
Albania 0 0 0 1 0 0 0
Argentina 9 0 0 5547 107 71 1370
Armenia 0 0 0 3 1 0 1
Australia 87 2 34 4740 94 162 2162
Austria 286 0 0 972 80 113 2825
Bosnia and Herzegovina 0 0 0 3 0 0 1
Brazil 0 0 0 40 0 28 6
Bulgaria 0 0 0 120 6 1 62
Canada 88 0 0 119 4 3 128
Chile 57 0 0 5578 103 26 2330
China 0 0 0 5 1 0 2
Croatia 2 0 0 51 5 0 51
Cyprus 5 0 0 8 0 0 7
Czech Republic 0 0 0 6 0 0 9
England 0 0 0 0 0 67 0
France 377 46 2 12813 2227 3308 9423
Georgia 1 0 0 48 4 4 49
Germany 165 0 0 228 27 73 3363
Greece 30 0 0 389 31 0 425
Hungary 67 0 0 66 5 6 95
India 0 0 0 5 0 0 4
Israel 10 0 1 585 16 9 150
Italy 554 11 7 17999 319 2378 5873
Japan 0 0 0 0 0 0 2
Kosovo 0 0 0 4 0 0 2
Lebanon 0 0 0 25 7 0 20
Lithuania 0 0 0 0 0 4 0
Luxembourg 0 0 0 0 0 2 6
Macedonia 0 0 0 12 0 0 6
Mexico 0 0 0 63 4 0 29
Moldova 5 0 0 50 1 9 27
Montenegro 0 0 0 1 0 0 0
Morocco 0 0 0 18 1 0 9
New Zealand 12 0 0 1241 13 23 1914
Peru 0 0 0 14 0 1 5
Portugal 23 37 994 4044 288 97 1910
Romania 1 0 0 77 3 1 84
Serbia 0 0 0 6 0 0 5
Slovakia 0 0 0 0 0 0 2
Slovenia 1 0 0 22 2 2 64
South Africa 31 0 5 1765 73 46 1331
South Korea 0 0 0 3 0 0 0
Spain 78 1 223 7623 448 971 2431
Switzerland 0 0 0 7 2 0 3
Turkey 0 0 0 74 7 2 23
US 944 12 114 63290 2026 1617 26564
US-France 0 0 0 0 0 0 1
Ukraine 2 0 0 3 1 9 4
Uruguay 2 1 0 155 4 0 24
In [14]:
df2.boxplot(column='points', by=None)
Out[14]:
<matplotlib.axes._subplots.AxesSubplot at 0x25d4f8c7ef0>
In [15]:
df2.boxplot(column='points', by='Category')
Out[15]:
<matplotlib.axes._subplots.AxesSubplot at 0x25d56e5a518>
In [16]:
df2.boxplot(column='points', by='l5',rot=45)
Out[16]:
<matplotlib.axes._subplots.AxesSubplot at 0x25d58a69c50>
In [17]:
df2.boxplot(column='points', by='taster',rot=45)
Out[17]:
<matplotlib.axes._subplots.AxesSubplot at 0x25d5f49e240>
In [18]:
df2.boxplot(column='price_per_liter', by=None,rot=45)
Out[18]:
<matplotlib.axes._subplots.AxesSubplot at 0x25d60100240>
In [19]:
df2.boxplot(column='price_per_liter', by=None,rot=45 , showfliers=False)
Out[19]:
<matplotlib.axes._subplots.AxesSubplot at 0x25d5f852b70>
In [20]:
df2.boxplot(column='price_per_liter', by='Category',rot=45)
Out[20]:
<matplotlib.axes._subplots.AxesSubplot at 0x25d5ade7198>
In [21]:
df2.boxplot(column='price_per_liter', by='Category',rot=45 , showfliers=False)
Out[21]:
<matplotlib.axes._subplots.AxesSubplot at 0x25d5f91dcc0>
In [22]:
df2.boxplot(column='price_per_liter', by='l5',rot=45)
Out[22]:
<matplotlib.axes._subplots.AxesSubplot at 0x25d5fa67eb8>
In [23]:
df2.boxplot(column='price_per_liter', by='l5',rot=45 , showfliers=False)
Out[23]:
<matplotlib.axes._subplots.AxesSubplot at 0x25d600702e8>
In [24]:
df2.boxplot(column='price_per_liter', by='taster',rot=45)
Out[24]:
<matplotlib.axes._subplots.AxesSubplot at 0x25d6229b390>
In [25]:
df2.boxplot(column='price_per_liter', by='taster',rot=45, showfliers=False)
Out[25]:
<matplotlib.axes._subplots.AxesSubplot at 0x25d638e0860>
In [26]:
sns.set_style("whitegrid")
ax = sns.stripplot(x='points', y='price_per_liter_clip', jitter=True, data=df2, alpha=.25)
In [27]:
sns.set_style("whitegrid")
ax = sns.stripplot(x='points', y='price_per_liter_clip', hue="Category", data=df2, jitter=True, alpha=.25)
In [28]:
sns.set_style("whitegrid")
ax = sns.stripplot(x='points', y='price_per_liter_clip', hue="taster", data=df2, jitter=1, alpha=.25)
In [29]:
ax = sns.violinplot(x='points', y='price_per_liter_clip', data=df2,inner=None, color=".8")
In [30]:
g = sns.factorplot(x="Category", y="points",
                   col="taster",
                   data=df2, kind="strip",
                   jitter=True,
                   size=4, aspect=.7);
In [13]:
stats.probplot(np.ravel(df2[['points']]), dist="norm", plot=pylab)
pylab.show()
stats.probplot(np.ravel(df2[['price_per_liter']]), dist="norm", plot=pylab)
pylab.show()
stats.probplot(np.ravel(df2[['price_per_liter_clip']]), dist="norm", plot=pylab)
pylab.show()

Visualize PCA

In [187]:
pca_i = 20
vectorizer = TfidfVectorizer()
#vectorizer = TfidfVectorizer(stop_words=stopwords.words('english'))
vectorizer.fit(df2.description.tolist())
Tfidf_df = vectorizer.transform(df2.description.tolist())
my_normalizer1 = Normalizer()
my_normalizer1.fit(Tfidf_df)
Tfidf_df = my_normalizer1.transform(Tfidf_df)
svd1 = TruncatedSVD(n_components=pca_i, n_iter=7, random_state=42)
svd1.fit(Tfidf_df)
Out[187]:
TruncatedSVD(algorithm='randomized', n_components=20, n_iter=7,
       random_state=42, tol=0.0)
In [201]:
def word_cloud_for_pca(component_i):
    old_component_i = component_i
    component_i = np.flip(np.argsort(svd1.explained_variance_ratio_),0)[component_i]
    
    pca_c_i = svd1.components_[component_i]
    high_indexes = np.where(np.abs(pca_c_i)>.03)
    my_features = vectorizer.get_feature_names()
    my_dic ={}
    for x in high_indexes[0]:
        my_dic[my_features[x]]= math.floor(pca_c_i[x]*1000)
    map_mask = np.array(Image.open("wine2_removed.png"))
    wc = WordCloud(background_color="white", max_words=4000, mask=map_mask)
    wc.generate_from_frequencies(my_dic)
    plt.imshow(wc, interpolation='bilinear')
    variance_explained = round(abs(svd1.explained_variance_ratio_[component_i]) * 100,2)
    plt.title("principal component "+str(old_component_i+1) + " - Variance Explained "+str(variance_explained)+"%")
    plt.axis("off")
    plt.show()
In [202]:
for i in range(0,20):
    word_cloud_for_pca(i)

Regression on Points

Function for Spliting the Data for Regression on Points

The code for spliting and transforming the data is placed in a function instead of duplicating the code every where.

In [6]:
def return_model_data_points(train_size, pca_i, input_df, keep_vars, save_tools = False):

    Category_df = pd.get_dummies(input_df[['Category']])
    Category_df = Category_df.drop(Category_df.columns[0], axis=1)
    Category_df = Category_df.reset_index(drop=True)
    l5_df = pd.get_dummies(input_df[['l5']])
    l5_df = l5_df.drop(l5_df.columns[0], axis=1)
    l5_df = l5_df.reset_index(drop=True)
    
    dummy_df = pd.concat([Category_df,l5_df], axis=1)
    
    
    train_test_split_output = train_test_split(input_df, dummy_df,input_df[['points']] , random_state=1, test_size=1-train_size)

    df_train, df_test, dummy_df_train, dummy_df_test, y_train, y_test = train_test_split_output

    vectorizer = TfidfVectorizer()
    #vectorizer = TfidfVectorizer(stop_words=stopwords.words('english'))
    vectorizer.fit(df_train.description.tolist())

    Tfidf_df_train = vectorizer.transform(df_train.description.tolist())
    Tfidf_df_test = vectorizer.transform(df_test.description.tolist())

    input_df_train = df_train.reset_index(drop=True).copy(deep=True)
    input_df_test = df_test.reset_index(drop=True).copy(deep=True)

    my_normalizer1 = Normalizer()
    my_normalizer1.fit(Tfidf_df_train)

    Tfidf_df_train = my_normalizer1.transform(Tfidf_df_train)
    Tfidf_df_test = my_normalizer1.transform(Tfidf_df_test)

    svd1 = TruncatedSVD(n_components=pca_i, n_iter=7, random_state=42)
    svd1.fit(Tfidf_df_train)

    text_df_train = pd.DataFrame(svd1.transform(Tfidf_df_train))
    text_df_train = text_df_train.reset_index(drop=True)
    text_df_test = pd.DataFrame(svd1.transform(Tfidf_df_test))
    text_df_test = text_df_test.reset_index(drop=True)

    input_df_train = input_df_train.reset_index(drop=True)
    input_df_test = input_df_test.reset_index(drop=True)
    dummy_df_train = dummy_df_train.reset_index(drop=True)
    dummy_df_test = dummy_df_test.reset_index(drop=True)
    text_df_train = text_df_train.reset_index(drop=True)
    text_df_test = text_df_test.reset_index(drop=True)
    
    final_input_train = pd.concat([input_df_train[keep_vars],dummy_df_train,text_df_train], axis=1)
    final_input_test = pd.concat([input_df_test[keep_vars], dummy_df_test,text_df_test], axis=1)

    scaler = StandardScaler()
    scaler.fit(final_input_train)


    XTrain = scaler.transform(final_input_train)
    XTest = scaler.transform(final_input_test)

    return(XTrain, XTest, 
           y_train, y_test)
In [14]:
g = sns.jointplot("price_per_liter", "points", data=df2, kind="reg")
In [15]:
g = sns.residplot("price_per_liter", "points", data=df2)

Going to only use clipped price per liter.

In [16]:
g = sns.jointplot("price_per_liter_clip", "points", data=df2, kind="reg")
In [17]:
g = sns.residplot("price_per_liter_clip", "points", data=df2)
In [18]:
results = sm.OLS(df2[['points']], df2[["price_per_liter_clip"]]).fit()
print(results.summary())
                            OLS Regression Results                            
==============================================================================
Dep. Variable:                 points   R-squared:                       0.740
Model:                            OLS   Adj. R-squared:                  0.740
Method:                 Least Squares   F-statistic:                 5.969e+05
Date:                Wed, 11 Apr 2018   Prob (F-statistic):               0.00
Time:                        09:39:40   Log-Likelihood:            -1.0966e+06
No. Observations:              209895   AIC:                         2.193e+06
Df Residuals:                  209894   BIC:                         2.193e+06
Df Model:                           1                                         
Covariance Type:            nonrobust                                         
========================================================================================
                           coef    std err          t      P>|t|      [0.025      0.975]
----------------------------------------------------------------------------------------
price_per_liter_clip     1.5704      0.002    772.619      0.000       1.566       1.574
==============================================================================
Omnibus:                    26500.090   Durbin-Watson:                   0.903
Prob(Omnibus):                  0.000   Jarque-Bera (JB):            38134.316
Skew:                          -1.043   Prob(JB):                         0.00
Kurtosis:                       3.083   Cond. No.                         1.00
==============================================================================

Warnings:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
In [7]:
train_size=.2
pca_i=50
XTrain, XTest, y_train, y_test = return_model_data_points(train_size=train_size, pca_i=pca_i, input_df=df2, keep_vars=['price_per_liter_clip'])

LinearRegression

In [8]:
reg = linear_model.LinearRegression()
reg.fit (XTrain, y_train)
y_pred_train = reg.predict(XTrain)
y_pred_test = reg.predict(XTest)
print(train_size,"\t",pca_i, "\t",
      metrics.explained_variance_score(y_true = y_train, y_pred = y_pred_train),"\t",
      metrics.explained_variance_score(y_true = y_test, y_pred = y_pred_test))
sns.regplot(x=y_test, y=y_pred_test, fit_reg=False)
## Something is wrong...
0.2 	 50 	 0.57693157242 	 -9.60578452896e+18
Out[8]:
<matplotlib.axes._subplots.AxesSubplot at 0x14825609208>

Ridge

In [10]:
for i in [0.1,0.5,1.0,2.0,5.0,10.0,50.0,100.0]:
    reg = linear_model.Ridge(alpha=i)
    reg.fit (XTrain, y_train)
    y_pred_train = reg.predict(XTrain)
    y_pred_test = reg.predict(XTest)
    print(i,"\t",train_size,"\t",pca_i, "\t",
          metrics.explained_variance_score(y_true = y_train, y_pred = y_pred_train),"\t",
          metrics.explained_variance_score(y_true = y_test, y_pred = y_pred_test))
sns.regplot(x=y_test, y=y_pred_test, fit_reg=False)
0.1 	 0.2 	 50 	 0.576935992332 	 0.573216499941
0.5 	 0.2 	 50 	 0.576935992272 	 0.573216547326
1.0 	 0.2 	 50 	 0.576935992086 	 0.573216606453
2.0 	 0.2 	 50 	 0.576935991342 	 0.573216724366
5.0 	 0.2 	 50 	 0.576935986135 	 0.573217075359
10.0 	 0.2 	 50 	 0.576935967562 	 0.573217651208
50.0 	 0.2 	 50 	 0.576935377753 	 0.573221848744
100.0 	 0.2 	 50 	 0.5769335556 	 0.57322608157
Out[10]:
<matplotlib.axes._subplots.AxesSubplot at 0x1482c41f668>

Lasso

In [12]:
for i in [0.01,.1,.2,.3,.4,.5,.6,.7,.8,.9,1]:
    reg = linear_model.Lasso(alpha = i)
    reg.fit (XTrain, y_train)
    y_pred_train = reg.predict(XTrain)
    y_pred_test = reg.predict(XTest)
    print(i,"\t",train_size,"\t",pca_i, "\t",
          metrics.explained_variance_score(y_true = y_train, y_pred = y_pred_train),"\t",
          metrics.explained_variance_score(y_true = y_test, y_pred = y_pred_test))
0.01 	 0.2 	 50 	 0.575986066361 	 0.572419294426
0.1 	 0.2 	 50 	 0.534540295045 	 0.532605140083
0.2 	 0.2 	 50 	 0.465400718504 	 0.465050013838
0.3 	 0.2 	 50 	 0.41204104232 	 0.412654827311
0.4 	 0.2 	 50 	 0.375154557165 	 0.375882765836
0.5 	 0.2 	 50 	 0.341375800417 	 0.342083333886
0.6 	 0.2 	 50 	 0.31349541044 	 0.314543717142
0.7 	 0.2 	 50 	 0.296654285517 	 0.297816438112
0.8 	 0.2 	 50 	 0.281645241169 	 0.282813967056
0.9 	 0.2 	 50 	 0.264634990909 	 0.265790100838
1 	 0.2 	 50 	 0.245623534736 	 0.246744839458

ElasticNet

In [150]:
for i in [0.01,.1,.25,.5,.75,1]:
    for j in [0.01,.1,.2,.3,.4,.5,.6,.7,.8,.9,1]:
        reg = linear_model.ElasticNet(alpha = i, l1_ratio=j)
        reg.fit (XTrain, y_train)
        y_pred_train = reg.predict(XTrain)
        y_pred_test = reg.predict(XTest)
        print(i,"\t",j,"\t",train_size,"\t",pca_i, "\t",
              metrics.explained_variance_score(y_true = y_train, y_pred = y_pred_train),"\t",
              metrics.explained_variance_score(y_true = y_test, y_pred = y_pred_test))
0.01 	 0.01 	 0.2 	 50 	 0.576893590782 	 0.573227722896
0.01 	 0.1 	 0.2 	 50 	 0.576874012248 	 0.573223273344
0.01 	 0.2 	 0.2 	 50 	 0.576836160716 	 0.573203007261
0.01 	 0.3 	 0.2 	 50 	 0.576782260861 	 0.573166007154
0.01 	 0.4 	 0.2 	 50 	 0.576712866623 	 0.573108505886
0.01 	 0.5 	 0.2 	 50 	 0.57662785092 	 0.573028543715
0.01 	 0.6 	 0.2 	 50 	 0.576527705685 	 0.572931954568
0.01 	 0.7 	 0.2 	 50 	 0.57641349824 	 0.572820674001
0.01 	 0.8 	 0.2 	 50 	 0.576285626182 	 0.572701003893
0.01 	 0.9 	 0.2 	 50 	 0.576143311468 	 0.572567378933
0.01 	 1 	 0.2 	 50 	 0.575986066361 	 0.572419294426
0.1 	 0.01 	 0.2 	 50 	 0.573586766002 	 0.570387746534
0.1 	 0.1 	 0.2 	 50 	 0.572052832199 	 0.568965791389
0.1 	 0.2 	 0.2 	 50 	 0.56953680712 	 0.566557086036
0.1 	 0.3 	 0.2 	 50 	 0.566407192501 	 0.563546449922
0.1 	 0.4 	 0.2 	 50 	 0.562916704795 	 0.560159259596
0.1 	 0.5 	 0.2 	 50 	 0.558935070807 	 0.556339858554
0.1 	 0.6 	 0.2 	 50 	 0.554867769226 	 0.552448793168
0.1 	 0.7 	 0.2 	 50 	 0.550471925607 	 0.548247422154
0.1 	 0.8 	 0.2 	 50 	 0.545593384844 	 0.543511960024
0.1 	 0.9 	 0.2 	 50 	 0.540242390113 	 0.538237807741
0.1 	 1 	 0.2 	 50 	 0.534540295045 	 0.532605140083
0.25 	 0.01 	 0.2 	 50 	 0.560956206851 	 0.558362627354
0.25 	 0.1 	 0.2 	 50 	 0.554171869036 	 0.55186696387
0.25 	 0.2 	 0.2 	 50 	 0.544100463913 	 0.542076157546
0.25 	 0.3 	 0.2 	 50 	 0.532938789713 	 0.531297036456
0.25 	 0.4 	 0.2 	 50 	 0.520003402778 	 0.518576543496
0.25 	 0.5 	 0.2 	 50 	 0.506369162766 	 0.505187441797
0.25 	 0.6 	 0.2 	 50 	 0.491338940642 	 0.490513963865
0.25 	 0.7 	 0.2 	 50 	 0.475591121397 	 0.475010135268
0.25 	 0.8 	 0.2 	 50 	 0.460867606155 	 0.460578495342
0.25 	 0.9 	 0.2 	 50 	 0.447522316741 	 0.447591383201
0.25 	 1 	 0.2 	 50 	 0.434071691381 	 0.434407579801
0.5 	 0.01 	 0.2 	 50 	 0.531632415953 	 0.529727113449
0.5 	 0.1 	 0.2 	 50 	 0.513078584333 	 0.511756774183
0.5 	 0.2 	 0.2 	 50 	 0.488146153484 	 0.487233119019
0.5 	 0.3 	 0.2 	 50 	 0.460874375875 	 0.460428352089
0.5 	 0.4 	 0.2 	 50 	 0.43251772539 	 0.432544382297
0.5 	 0.5 	 0.2 	 50 	 0.407447128469 	 0.407875968558
0.5 	 0.6 	 0.2 	 50 	 0.390150688298 	 0.390855543368
0.5 	 0.7 	 0.2 	 50 	 0.37607742137 	 0.376905862358
0.5 	 0.8 	 0.2 	 50 	 0.363335887349 	 0.36415136173
0.5 	 0.9 	 0.2 	 50 	 0.351284175433 	 0.352036738446
0.5 	 1 	 0.2 	 50 	 0.341375800417 	 0.342083333886
0.75 	 0.01 	 0.2 	 50 	 0.5000680953 	 0.49862210343
0.75 	 0.1 	 0.2 	 50 	 0.468705856044 	 0.4680519622
0.75 	 0.2 	 0.2 	 50 	 0.428354547165 	 0.428127083416
0.75 	 0.3 	 0.2 	 50 	 0.389002502857 	 0.389430006449
0.75 	 0.4 	 0.2 	 50 	 0.359283176887 	 0.360000845263
0.75 	 0.5 	 0.2 	 50 	 0.338084260705 	 0.338988115747
0.75 	 0.6 	 0.2 	 50 	 0.320198965114 	 0.32104493684
0.75 	 0.7 	 0.2 	 50 	 0.30575751996 	 0.30656098838
0.75 	 0.8 	 0.2 	 50 	 0.294524347417 	 0.295538086543
0.75 	 0.9 	 0.2 	 50 	 0.288255945038 	 0.289424348368
0.75 	 1 	 0.2 	 50 	 0.289399914082 	 0.290567876979
1 	 0.01 	 0.2 	 50 	 0.469333439471 	 0.468228121803
1 	 0.1 	 0.2 	 50 	 0.424860926947 	 0.424629938203
1 	 0.2 	 0.2 	 50 	 0.372371666093 	 0.372645677258
1 	 0.3 	 0.2 	 50 	 0.331183202344 	 0.331962881148
1 	 0.4 	 0.2 	 50 	 0.301875160719 	 0.302791550508
1 	 0.5 	 0.2 	 50 	 0.279705160637 	 0.280499203856
1 	 0.6 	 0.2 	 50 	 0.263923027326 	 0.264855098033
1 	 0.7 	 0.2 	 50 	 0.253498906987 	 0.254603629933
1 	 0.8 	 0.2 	 50 	 0.250280972551 	 0.251412056017
1 	 0.9 	 0.2 	 50 	 0.248177714676 	 0.24930449207
1 	 1 	 0.2 	 50 	 0.245623534736 	 0.246744839458

BayesianRidge

In [13]:
reg = linear_model.BayesianRidge()
reg.fit (XTrain, np.ravel(y_train))
y_pred_train = reg.predict(XTrain)
y_pred_test = reg.predict(XTest)
print(train_size,"\t",pca_i, "\t",
      metrics.explained_variance_score(y_true = y_train, y_pred = y_pred_train),"\t",
      metrics.explained_variance_score(y_true = y_test, y_pred = y_pred_test))
sns.regplot(x=y_test, y=y_pred_test, fit_reg=False)
0.2 	 50 	 0.576933697755 	 0.57322586024
Out[13]:
<matplotlib.axes._subplots.AxesSubplot at 0x1482b2d9860>

Epsilon-Support Vector Regression

In [14]:
reg = svm.SVR()
reg.fit (XTrain, np.ravel(y_train))
y_pred_train = reg.predict(XTrain)
y_pred_test = reg.predict(XTest)
print(train_size,"\t",pca_i, "\t",
      metrics.explained_variance_score(y_true = y_train, y_pred = y_pred_train),"\t",
      metrics.explained_variance_score(y_true = y_test, y_pred = y_pred_test))
sns.regplot(x=y_test, y=y_pred_test, fit_reg=False)
0.2 	 50 	 0.70426790027 	 0.656844115156
Out[14]:
<matplotlib.axes._subplots.AxesSubplot at 0x1482a0edb00>

KNeighborsRegressor

In [15]:
reg = KNeighborsRegressor(n_neighbors=5)
reg.fit (XTrain, np.ravel(y_train))
y_pred_train = reg.predict(XTrain)
y_pred_test = reg.predict(XTest)
print(train_size,"\t",pca_i, "\t",
      metrics.explained_variance_score(y_true = y_train, y_pred = y_pred_train),"\t",
      metrics.explained_variance_score(y_true = y_test, y_pred = y_pred_test))
sns.regplot(x=y_test, y=y_pred_test, fit_reg=False)
0.2 	 50 	 0.622527718053 	 0.432821996652
Out[15]:
<matplotlib.axes._subplots.AxesSubplot at 0x1482a2352e8>

DecisionTreeRegressor

In [16]:
for i  in [1,2,3,4,5,6,7,8,9,10,11,12,13,14]:
    reg = DecisionTreeRegressor(max_depth=i)
    reg.fit (XTrain, np.ravel(y_train))
    y_pred_train = reg.predict(XTrain)
    y_pred_test = reg.predict(XTest)
    print(i,"\t",train_size,"\t",pca_i, "\t",
          metrics.explained_variance_score(y_true = y_train, y_pred = y_pred_train),"\t",
          metrics.explained_variance_score(y_true = y_test, y_pred = y_pred_test))
1 	 0.2 	 50 	 0.251932279139 	 0.249152937977
2 	 0.2 	 50 	 0.341400724192 	 0.338789311876
3 	 0.2 	 50 	 0.373823694253 	 0.367008755166
4 	 0.2 	 50 	 0.408825435999 	 0.398285452377
5 	 0.2 	 50 	 0.436963415561 	 0.418303149015
6 	 0.2 	 50 	 0.464584350532 	 0.435764443706
7 	 0.2 	 50 	 0.492742686219 	 0.444242915314
8 	 0.2 	 50 	 0.524234486312 	 0.445670237202
9 	 0.2 	 50 	 0.561980925478 	 0.434492809543
10 	 0.2 	 50 	 0.606457965345 	 0.410860021444
11 	 0.2 	 50 	 0.659587782819 	 0.376260786637
12 	 0.2 	 50 	 0.716714866706 	 0.328845918328
13 	 0.2 	 50 	 0.773351091464 	 0.2791865206
14 	 0.2 	 50 	 0.825830874601 	 0.23339134123

ExtraTreesRegressor

In [17]:
for i  in [1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20]:
    reg = ExtraTreesRegressor(max_depth=i)
    reg.fit (XTrain, np.ravel(y_train))
    y_pred_train = reg.predict(XTrain)
    y_pred_test = reg.predict(XTest)
    print(i,"\t",train_size,"\t",pca_i, "\t",
          metrics.explained_variance_score(y_true = y_train, y_pred = y_pred_train),"\t",
          metrics.explained_variance_score(y_true = y_test, y_pred = y_pred_test))
sns.regplot(x=y_test, y=y_pred_test, fit_reg=False)
1 	 0.2 	 50 	 0.250702831543 	 0.252233909912
2 	 0.2 	 50 	 0.346014760001 	 0.347410376953
3 	 0.2 	 50 	 0.373042889894 	 0.372361558402
4 	 0.2 	 50 	 0.397748125722 	 0.395700369936
5 	 0.2 	 50 	 0.424961996697 	 0.419465378583
6 	 0.2 	 50 	 0.445479095931 	 0.43676927351
7 	 0.2 	 50 	 0.483306877972 	 0.465467119964
8 	 0.2 	 50 	 0.503266109002 	 0.47527392397
9 	 0.2 	 50 	 0.540374549472 	 0.492036931301
10 	 0.2 	 50 	 0.576488451291 	 0.503864514768
11 	 0.2 	 50 	 0.608078529371 	 0.509295065791
12 	 0.2 	 50 	 0.667721131844 	 0.521632938154
13 	 0.2 	 50 	 0.708395080379 	 0.528899576287
14 	 0.2 	 50 	 0.773132969264 	 0.534209436904
15 	 0.2 	 50 	 0.808661634145 	 0.53594828409
16 	 0.2 	 50 	 0.848451699881 	 0.536891977371
17 	 0.2 	 50 	 0.895176520013 	 0.536335558463
18 	 0.2 	 50 	 0.91931308845 	 0.532624971956
19 	 0.2 	 50 	 0.948828666178 	 0.53472422295
20 	 0.2 	 50 	 0.96453404874 	 0.535486473036
Out[17]:
<matplotlib.axes._subplots.AxesSubplot at 0x1482a695f60>

RandomForestRegressor

In [18]:
for i  in [1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20]:
    reg = RandomForestRegressor(max_depth=i)
    reg.fit (XTrain, np.ravel(y_train))
    y_pred_train = reg.predict(XTrain)
    y_pred_test = reg.predict(XTest)
    print(i,"\t",train_size,"\t",pca_i, "\t",
          metrics.explained_variance_score(y_true = y_train, y_pred = y_pred_train),"\t",
          metrics.explained_variance_score(y_true = y_test, y_pred = y_pred_test))
sns.regplot(x=y_test, y=y_pred_test, fit_reg=False)
1 	 0.2 	 50 	 0.256239624072 	 0.253624945253
2 	 0.2 	 50 	 0.347948655748 	 0.346043808747
3 	 0.2 	 50 	 0.388934626338 	 0.382918891847
4 	 0.2 	 50 	 0.424056418 	 0.414801134982
5 	 0.2 	 50 	 0.461091626527 	 0.445212117087
6 	 0.2 	 50 	 0.495377842338 	 0.471388787276
7 	 0.2 	 50 	 0.52918087168 	 0.490001754357
8 	 0.2 	 50 	 0.566430208658 	 0.503649795119
9 	 0.2 	 50 	 0.609478815003 	 0.516432159062
10 	 0.2 	 50 	 0.659313456614 	 0.524709348032
11 	 0.2 	 50 	 0.708615397698 	 0.530747357945
12 	 0.2 	 50 	 0.752191917314 	 0.531263426368
13 	 0.2 	 50 	 0.795701892029 	 0.530907021566
14 	 0.2 	 50 	 0.828366796496 	 0.528625600445
15 	 0.2 	 50 	 0.85394667325 	 0.529896731915
16 	 0.2 	 50 	 0.872910575089 	 0.525058322352
17 	 0.2 	 50 	 0.885542811777 	 0.526667821111
18 	 0.2 	 50 	 0.89755475626 	 0.525076414466
19 	 0.2 	 50 	 0.904596780879 	 0.520420874364
20 	 0.2 	 50 	 0.906148294718 	 0.520417852396
Out[18]:
<matplotlib.axes._subplots.AxesSubplot at 0x1482b1a2b70>

BaggingRegressor

In [19]:
reg = BaggingRegressor()
reg.fit (XTrain, np.ravel(y_train))
y_pred_train = reg.predict(XTrain)
y_pred_test = reg.predict(XTest)
print(train_size,"\t",pca_i, "\t",
      metrics.explained_variance_score(y_true = y_train, y_pred = y_pred_train),"\t",
      metrics.explained_variance_score(y_true = y_test, y_pred = y_pred_test))
sns.regplot(x=y_test, y=y_pred_test, fit_reg=False)
0.2 	 50 	 0.912382812037 	 0.521644487498
Out[19]:
<matplotlib.axes._subplots.AxesSubplot at 0x1482b2915f8>

AdaBoostRegressor

In [20]:
reg = AdaBoostRegressor()
reg.fit (XTrain, np.ravel(y_train))
y_pred_train = reg.predict(XTrain)
y_pred_test = reg.predict(XTest)
print(train_size,"\t",pca_i, "\t",
      metrics.explained_variance_score(y_true = y_train, y_pred = y_pred_train),"\t",
      metrics.explained_variance_score(y_true = y_test, y_pred = y_pred_test))
sns.regplot(x=y_test, y=y_pred_test, fit_reg=False)
0.2 	 50 	 0.471425117608 	 0.463359709722
Out[20]:
<matplotlib.axes._subplots.AxesSubplot at 0x1482b9b7be0>

GradientBoostingRegressor

In [21]:
reg = GradientBoostingRegressor()
reg.fit (XTrain, np.ravel(y_train))
y_pred_train = reg.predict(XTrain)
y_pred_test = reg.predict(XTest)
print(train_size,"\t",pca_i, "\t",
      metrics.explained_variance_score(y_true = y_train, y_pred = y_pred_train),"\t",
      metrics.explained_variance_score(y_true = y_test, y_pred = y_pred_test))
sns.regplot(x=y_test, y=y_pred_test, fit_reg=False)
0.2 	 50 	 0.596213044953 	 0.570234555362
Out[21]:
<matplotlib.axes._subplots.AxesSubplot at 0x1482bb187b8>

MLPRegressor

In [22]:
reg = MLPRegressor()
reg.fit (XTrain, np.ravel(y_train))
y_pred_train = reg.predict(XTrain)
y_pred_test = reg.predict(XTest)
print(train_size,"\t",pca_i, "\t",
      metrics.explained_variance_score(y_true = y_train, y_pred = y_pred_train),"\t",
      metrics.explained_variance_score(y_true = y_test, y_pred = y_pred_test))
sns.regplot(x=y_test, y=y_pred_test, fit_reg=False)
0.2 	 50 	 0.681244331661 	 0.588978218664
Out[22]:
<matplotlib.axes._subplots.AxesSubplot at 0x1482baff3c8>

MLPRegressor Scaled

In [23]:
scalery = StandardScaler()
scalery.fit(y_train)

reg = MLPRegressor( activation="logistic")
reg.fit (XTrain, np.ravel(scalery.transform(y_train)))
y_pred_train = scalery.inverse_transform(reg.predict(XTrain))
y_pred_test = scalery.inverse_transform(reg.predict(XTest))
print(train_size,"\t",pca_i, "\t",
      metrics.explained_variance_score(y_true = y_train, y_pred = y_pred_train),"\t",
      metrics.explained_variance_score(y_true = y_test, y_pred = y_pred_test))
sns.regplot(x=y_test, y=y_pred_test, fit_reg=False)
0.2 	 50 	 0.771300238183 	 0.607682357428
Out[23]:
<matplotlib.axes._subplots.AxesSubplot at 0x1482c4a26d8>

Create interactive chart

Must rerun for interactive chart!

In [3]:
%%javascript
// Since I append the div later, sometimes there are multiple divs.
$("#container0").remove();

// Make the cdiv to contain the chart.
element.append('<div id="container0" style="min-width: 310px; height: 400px; margin: 0 auto"></div>');

// Require highcarts and make the chart.
require(['highcharts_exports'], function(Highcharts) {
    $('#container0').highcharts({
        title: {
        text: 'Regression on 20% Train'
    },
        plotOptions: {
            scatter: {
                dataLabels: {
                    format: "{point.name}",
                    enabled: true
                },
                
                enableMouseTracking: false
            }
        },
        
        yAxis: {
        title: {
            text: 'test'
        }
    },xAxis: {
        title: {
            text: 'train'
        }
    },
       
        legend: {
            enabled: false
        },
        series: [{name:'LinearRegression',data:[[0.57693157242,0]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'Ridge a-0.1',data:[[0.576935992332,0.573216499941]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'Ridge a-0.5',data:[[0.576935992272,0.573216547326]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'Ridge a-1',data:[[0.576935992086,0.573216606453]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'Ridge a-2',data:[[0.576935991342,0.573216724366]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'Ridge a-5',data:[[0.576935986135,0.573217075359]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'Ridge a-10',data:[[0.576935967562,0.573217651208]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'Ridge a-50',data:[[0.576935377753,0.573221848744]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'Ridge a-100',data:[[0.5769335556,0.57322608157]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'Lasso a-0.01',data:[[0.575986066361,0.572419294426]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'Lasso a-0.1',data:[[0.534540295045,0.532605140083]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'Lasso a-0.2',data:[[0.465400718504,0.465050013838]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'Lasso a-0.3',data:[[0.41204104232,0.412654827311]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'Lasso a-0.4',data:[[0.375154557165,0.375882765836]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'Lasso a-0.5',data:[[0.341375800417,0.342083333886]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'Lasso a-0.6',data:[[0.31349541044,0.314543717142]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'Lasso a-0.7',data:[[0.296654285517,0.297816438112]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'Lasso a-0.8',data:[[0.281645241169,0.282813967056]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'Lasso a-0.9',data:[[0.264634990909,0.265790100838]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'Lasso a-1',data:[[0.245623534736,0.246744839458]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ElasticNet',data:[[0.576893590782,0.573227722896]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'BayesianRidge',data:[[0.576933697755,0.57322586024]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'Epsilon-Support Vector Regression',data:[[0.70426790027,0.656844115156]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'KNeighborsRegressor',data:[[0.622527718053,0.432821996652]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'DecisionTreeRegressor md-1',data:[[0.251932279139,0.249152937977]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'DecisionTreeRegressor md-2',data:[[0.341400724192,0.338789311876]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'DecisionTreeRegressor md-3',data:[[0.373823694253,0.367008755166]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'DecisionTreeRegressor md-4',data:[[0.408825435999,0.398285452377]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'DecisionTreeRegressor md-5',data:[[0.436963415561,0.418303149015]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'DecisionTreeRegressor md-6',data:[[0.464584350532,0.435764443706]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'DecisionTreeRegressor md-7',data:[[0.492742686219,0.444242915314]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'DecisionTreeRegressor md-8',data:[[0.524234486312,0.445643770674]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'DecisionTreeRegressor md-9',data:[[0.561980925478,0.434528450405]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'DecisionTreeRegressor md-10',data:[[0.60646717857,0.409553241429]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'DecisionTreeRegressor md-11',data:[[0.659616755222,0.377614687389]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'DecisionTreeRegressor md-12',data:[[0.716702294151,0.331247997956]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'DecisionTreeRegressor md-13',data:[[0.773246101054,0.282921586019]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'DecisionTreeRegressor md-14',data:[[0.826168052945,0.232335639987]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ExtraTreesRegressor md-1',data:[[0.276218453289,0.277302859305]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ExtraTreesRegressor md-2',data:[[0.334372146946,0.334800958001]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ExtraTreesRegressor md-3',data:[[0.341710864848,0.342158171534]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ExtraTreesRegressor md-4',data:[[0.395177437638,0.393018122047]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ExtraTreesRegressor md-5',data:[[0.418316846897,0.412944158794]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ExtraTreesRegressor md-6',data:[[0.433999979819,0.425637676073]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ExtraTreesRegressor md-7',data:[[0.47331644118,0.457239340214]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ExtraTreesRegressor md-8',data:[[0.503824204823,0.47544996857]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ExtraTreesRegressor md-9',data:[[0.536455051507,0.487497203427]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ExtraTreesRegressor md-10',data:[[0.573390642294,0.502803734809]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ExtraTreesRegressor md-11',data:[[0.614982125512,0.513066632459]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ExtraTreesRegressor md-12',data:[[0.648900220316,0.517159108847]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ExtraTreesRegressor md-13',data:[[0.705539937704,0.525858012936]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ExtraTreesRegressor md-14',data:[[0.75273761587,0.530661565234]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ExtraTreesRegressor md-15',data:[[0.824681855177,0.536727670344]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ExtraTreesRegressor md-16',data:[[0.839869737177,0.534983660466]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ExtraTreesRegressor md-17',data:[[0.880005192681,0.537550346258]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ExtraTreesRegressor md-18',data:[[0.917548922837,0.534934652633]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ExtraTreesRegressor md-19',data:[[0.947038604116,0.535134292795]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ExtraTreesRegressor md-20',data:[[0.961746205899,0.535312367497]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'RandomForestRegressor md-1',data:[[0.251932190003,0.249150563766]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'RandomForestRegressor md-2',data:[[0.345690387606,0.343724499419]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'RandomForestRegressor md-3',data:[[0.385052276977,0.379435051034]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'RandomForestRegressor md-4',data:[[0.425255082027,0.415273738603]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'RandomForestRegressor md-5',data:[[0.462670423378,0.446221157983]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'RandomForestRegressor md-6',data:[[0.493356006109,0.467804473027]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'RandomForestRegressor md-7',data:[[0.529782851151,0.490420897987]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'RandomForestRegressor md-8',data:[[0.567668710916,0.505424908231]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'RandomForestRegressor md-9',data:[[0.611066100862,0.517659696631]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'RandomForestRegressor md-10',data:[[0.660582133165,0.524881316839]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'RandomForestRegressor md-11',data:[[0.704588271716,0.527043811798]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'RandomForestRegressor md-12',data:[[0.751068808656,0.531196757332]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'RandomForestRegressor md-13',data:[[0.792973665758,0.530265449309]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'RandomForestRegressor md-14',data:[[0.832133500033,0.527065076469]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'RandomForestRegressor md-15',data:[[0.852528706076,0.526638200629]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'RandomForestRegressor md-16',data:[[0.873256080887,0.528913301794]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'RandomForestRegressor md-17',data:[[0.887194127046,0.524580152187]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'RandomForestRegressor md-18',data:[[0.897051010579,0.521906266137]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'RandomForestRegressor md-19',data:[[0.903416142359,0.520207740243]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'RandomForestRegressor md-20',data:[[0.907977793058,0.52126633392]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'BaggingRegressor',data:[[0.913977685751,0.52026010201]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'AdaBoostRegressor',data:[[0.472260117792,0.462546681236]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'GradientBoostingRegressor',data:[[0.596213044953,0.570234555362]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'MLPRegressor',data:[[0.672571071177,0.586911674866]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'MLPRegressor Scaled',data:[[0.774730419101,0.604890555792]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}}]
    });
});

SVM does the best across all methods.

Test different parameters for GradientBoostingRegressor

In [28]:
train_size=.05
pca_i=100
XTrain, XTest, y_train, y_test = return_model_data_points(train_size=train_size, pca_i=pca_i, input_df=df2, keep_vars=['price_per_liter_clip'])
In [10]:
loss_list=['ls','lad','huber','quantile']
learning_rate_list = [0.1,0.15,0.2,.25,.5,.75]
max_depth_list = [1,2,3,4,5,6,7,8,9]
for loss_i in loss_list:
    for learning_rate_i in learning_rate_list:
        for max_depth_i in max_depth_list:
            reg = GradientBoostingRegressor(loss=loss_i,learning_rate=learning_rate_i, max_depth=max_depth_i)
            reg.fit (XTrain, np.ravel(y_train))
            y_pred_train = reg.predict(XTrain)
            y_pred_test = reg.predict(XTest)
            print(loss_i,"\t",learning_rate_i,"\t",max_depth_i,"\t",train_size,"\t",pca_i, "\t",
                  metrics.explained_variance_score(y_true = y_train, y_pred = y_pred_train),"\t",
                  metrics.explained_variance_score(y_true = y_test, y_pred = y_pred_test))
ls 	 0.1 	 1 	 0.05 	 100 	 0.483195481072 	 0.474670631554
ls 	 0.1 	 2 	 0.05 	 100 	 0.573312148488 	 0.542484573263
ls 	 0.1 	 3 	 0.05 	 100 	 0.639864359097 	 0.572574474065
ls 	 0.1 	 4 	 0.05 	 100 	 0.711570113162 	 0.591042550157
ls 	 0.1 	 5 	 0.05 	 100 	 0.788853944671 	 0.597958416133
ls 	 0.1 	 6 	 0.05 	 100 	 0.864170521402 	 0.599622857668
ls 	 0.1 	 7 	 0.05 	 100 	 0.930424766073 	 0.596030158475
ls 	 0.1 	 8 	 0.05 	 100 	 0.968269131224 	 0.587936201096
ls 	 0.1 	 9 	 0.05 	 100 	 0.991099822719 	 0.57738851849
ls 	 0.15 	 1 	 0.05 	 100 	 0.519173600994 	 0.506530808563
ls 	 0.15 	 2 	 0.05 	 100 	 0.606877851635 	 0.564962613525
ls 	 0.15 	 3 	 0.05 	 100 	 0.678489135064 	 0.589166602604
ls 	 0.15 	 4 	 0.05 	 100 	 0.754201228068 	 0.59949633042
ls 	 0.15 	 5 	 0.05 	 100 	 0.830949703617 	 0.601950337646
ls 	 0.15 	 6 	 0.05 	 100 	 0.897615111022 	 0.5980906965
ls 	 0.15 	 7 	 0.05 	 100 	 0.955596525332 	 0.590033544289
ls 	 0.15 	 8 	 0.05 	 100 	 0.983457954212 	 0.579793165364
ls 	 0.15 	 9 	 0.05 	 100 	 0.997128038571 	 0.567213632333
ls 	 0.2 	 1 	 0.05 	 100 	 0.542924522414 	 0.526893958616
ls 	 0.2 	 2 	 0.05 	 100 	 0.629785050773 	 0.578424459603
ls 	 0.2 	 3 	 0.05 	 100 	 0.703825126206 	 0.595344980637
ls 	 0.2 	 4 	 0.05 	 100 	 0.78146733493 	 0.601375735172
ls 	 0.2 	 5 	 0.05 	 100 	 0.855945803152 	 0.595180303543
ls 	 0.2 	 6 	 0.05 	 100 	 0.922226987158 	 0.590351006137
ls 	 0.2 	 7 	 0.05 	 100 	 0.969728483677 	 0.580146372507
ls 	 0.2 	 8 	 0.05 	 100 	 0.990807379293 	 0.568193397591
ls 	 0.2 	 9 	 0.05 	 100 	 0.998564304007 	 0.552465567057
ls 	 0.25 	 1 	 0.05 	 100 	 0.559405555776 	 0.540754120087
ls 	 0.25 	 2 	 0.05 	 100 	 0.64650911693 	 0.586309813089
ls 	 0.25 	 3 	 0.05 	 100 	 0.720777611522 	 0.598429031394
ls 	 0.25 	 4 	 0.05 	 100 	 0.800040485961 	 0.59607520576
ls 	 0.25 	 5 	 0.05 	 100 	 0.877771541174 	 0.590470901025
ls 	 0.25 	 6 	 0.05 	 100 	 0.939282385401 	 0.578703560005
ls 	 0.25 	 7 	 0.05 	 100 	 0.977229088685 	 0.567272206962
ls 	 0.25 	 8 	 0.05 	 100 	 0.995125947476 	 0.554312828629
ls 	 0.25 	 9 	 0.05 	 100 	 0.99955848554 	 0.540051802073
ls 	 0.5 	 1 	 0.05 	 100 	 0.597612520156 	 0.563577973277
ls 	 0.5 	 2 	 0.05 	 100 	 0.681079881294 	 0.582230505545
ls 	 0.5 	 3 	 0.05 	 100 	 0.761461322512 	 0.57405556384
ls 	 0.5 	 4 	 0.05 	 100 	 0.849517402345 	 0.544959603289
ls 	 0.5 	 5 	 0.05 	 100 	 0.928608187651 	 0.514958440526
ls 	 0.5 	 6 	 0.05 	 100 	 0.977741120904 	 0.492438416477
ls 	 0.5 	 7 	 0.05 	 100 	 0.995730500572 	 0.474708972368
ls 	 0.5 	 8 	 0.05 	 100 	 0.999773201584 	 0.461162568518
ls 	 0.5 	 9 	 0.05 	 100 	 0.999991519654 	 0.445805099294
ls 	 0.75 	 1 	 0.05 	 100 	 0.606852019672 	 0.561684732504
ls 	 0.75 	 2 	 0.05 	 100 	 0.691586078645 	 0.56485422784
ls 	 0.75 	 3 	 0.05 	 100 	 0.778952150523 	 0.512579454393
ls 	 0.75 	 4 	 0.05 	 100 	 0.877017801221 	 0.462789396318
ls 	 0.75 	 5 	 0.05 	 100 	 0.954800912462 	 0.400535774188
ls 	 0.75 	 6 	 0.05 	 100 	 0.991349439505 	 0.359251139115
ls 	 0.75 	 7 	 0.05 	 100 	 0.999104622718 	 0.338249867878
ls 	 0.75 	 8 	 0.05 	 100 	 0.999969242334 	 0.320387508687
ls 	 0.75 	 9 	 0.05 	 100 	 0.999999723331 	 0.309320133057
lad 	 0.1 	 1 	 0.05 	 100 	 0.470147244234 	 0.463792116696
lad 	 0.1 	 2 	 0.05 	 100 	 0.552889121664 	 0.53014259963
lad 	 0.1 	 3 	 0.05 	 100 	 0.61240896779 	 0.56339443216
lad 	 0.1 	 4 	 0.05 	 100 	 0.660189367638 	 0.574401693833
lad 	 0.1 	 5 	 0.05 	 100 	 0.708718359382 	 0.581121909447
lad 	 0.1 	 6 	 0.05 	 100 	 0.753550867917 	 0.583106521626
lad 	 0.1 	 7 	 0.05 	 100 	 0.79126941866 	 0.577200902192
lad 	 0.1 	 8 	 0.05 	 100 	 0.827173658816 	 0.572023240758
lad 	 0.1 	 9 	 0.05 	 100 	 0.86302982277 	 0.567393604828
lad 	 0.15 	 1 	 0.05 	 100 	 0.50531548943 	 0.495426880447
lad 	 0.15 	 2 	 0.05 	 100 	 0.584173448482 	 0.553337009746
lad 	 0.15 	 3 	 0.05 	 100 	 0.638287211259 	 0.575925501813
lad 	 0.15 	 4 	 0.05 	 100 	 0.686871547205 	 0.583072692261
lad 	 0.15 	 5 	 0.05 	 100 	 0.731838475214 	 0.584479040752
lad 	 0.15 	 6 	 0.05 	 100 	 0.777482549972 	 0.583185647494
lad 	 0.15 	 7 	 0.05 	 100 	 0.815524920058 	 0.579326200172
lad 	 0.15 	 8 	 0.05 	 100 	 0.8474383006 	 0.571369892454
lad 	 0.15 	 9 	 0.05 	 100 	 0.875936489803 	 0.56192039182
lad 	 0.2 	 1 	 0.05 	 100 	 0.526634366283 	 0.513101736847
lad 	 0.2 	 2 	 0.05 	 100 	 0.601853945648 	 0.564377943386
lad 	 0.2 	 3 	 0.05 	 100 	 0.657019346919 	 0.583643442889
lad 	 0.2 	 4 	 0.05 	 100 	 0.699606843876 	 0.583923832298
lad 	 0.2 	 5 	 0.05 	 100 	 0.746909026349 	 0.583862672448
lad 	 0.2 	 6 	 0.05 	 100 	 0.784414301077 	 0.580506250823
lad 	 0.2 	 7 	 0.05 	 100 	 0.825115258714 	 0.575212067096
lad 	 0.2 	 8 	 0.05 	 100 	 0.855249257636 	 0.563140362391
lad 	 0.2 	 9 	 0.05 	 100 	 0.887213285797 	 0.555947943111
lad 	 0.25 	 1 	 0.05 	 100 	 0.544964643176 	 0.529631214942
lad 	 0.25 	 2 	 0.05 	 100 	 0.614969012229 	 0.571436672092
lad 	 0.25 	 3 	 0.05 	 100 	 0.666229813029 	 0.584058564796
lad 	 0.25 	 4 	 0.05 	 100 	 0.711360297902 	 0.581802582589
lad 	 0.25 	 5 	 0.05 	 100 	 0.752913444559 	 0.579061446238
lad 	 0.25 	 6 	 0.05 	 100 	 0.784434659618 	 0.5741186018
lad 	 0.25 	 7 	 0.05 	 100 	 0.824056919477 	 0.567506882236
lad 	 0.25 	 8 	 0.05 	 100 	 0.861304393256 	 0.556674638704
lad 	 0.25 	 9 	 0.05 	 100 	 0.886159913456 	 0.548212034485
lad 	 0.5 	 1 	 0.05 	 100 	 0.541070180523 	 0.520994250388
lad 	 0.5 	 2 	 0.05 	 100 	 0.647893425537 	 0.577629536759
lad 	 0.5 	 3 	 0.05 	 100 	 0.688345597803 	 0.57000271842
lad 	 0.5 	 4 	 0.05 	 100 	 0.733897399816 	 0.555695882275
lad 	 0.5 	 5 	 0.05 	 100 	 0.769537623812 	 0.547997992185
lad 	 0.5 	 6 	 0.05 	 100 	 0.807249667709 	 0.533532324904
lad 	 0.5 	 7 	 0.05 	 100 	 0.835757890632 	 0.518192532309
lad 	 0.5 	 8 	 0.05 	 100 	 0.87284285622 	 0.498597683566
lad 	 0.5 	 9 	 0.05 	 100 	 0.896917529415 	 0.484213472853
lad 	 0.75 	 1 	 0.05 	 100 	 0.596317283023 	 0.556183736385
lad 	 0.75 	 2 	 0.05 	 100 	 0.652990330282 	 0.563488121362
lad 	 0.75 	 3 	 0.05 	 100 	 0.698452223812 	 0.54571542435
lad 	 0.75 	 4 	 0.05 	 100 	 0.731143277028 	 0.512059059729
lad 	 0.75 	 5 	 0.05 	 100 	 0.782148045705 	 0.486256502673
lad 	 0.75 	 6 	 0.05 	 100 	 0.811386651861 	 0.45511620425
lad 	 0.75 	 7 	 0.05 	 100 	 0.843396315063 	 0.439166826033
lad 	 0.75 	 8 	 0.05 	 100 	 0.879474561023 	 0.412184112483
lad 	 0.75 	 9 	 0.05 	 100 	 0.897729340138 	 0.390995879317
huber 	 0.1 	 1 	 0.05 	 100 	 0.484702820443 	 0.476256697828
huber 	 0.1 	 2 	 0.05 	 100 	 0.572787279837 	 0.543573328207
huber 	 0.1 	 3 	 0.05 	 100 	 0.636429581541 	 0.572089154825
huber 	 0.1 	 4 	 0.05 	 100 	 0.705626481701 	 0.58987898467
huber 	 0.1 	 5 	 0.05 	 100 	 0.782201997196 	 0.597765022006
huber 	 0.1 	 6 	 0.05 	 100 	 0.852819556298 	 0.598969575974
huber 	 0.1 	 7 	 0.05 	 100 	 0.914158427534 	 0.595389577059
huber 	 0.1 	 8 	 0.05 	 100 	 0.955300038064 	 0.586869064995
huber 	 0.1 	 9 	 0.05 	 100 	 0.980411857223 	 0.577367233848
huber 	 0.15 	 1 	 0.05 	 100 	 0.519941627966 	 0.507897862138
huber 	 0.15 	 2 	 0.05 	 100 	 0.607327118468 	 0.566325242824
huber 	 0.15 	 3 	 0.05 	 100 	 0.676699207771 	 0.589211731403
huber 	 0.15 	 4 	 0.05 	 100 	 0.746618168939 	 0.59665550598
huber 	 0.15 	 5 	 0.05 	 100 	 0.821272306105 	 0.60081752492
huber 	 0.15 	 6 	 0.05 	 100 	 0.885438298961 	 0.59769842929
huber 	 0.15 	 7 	 0.05 	 100 	 0.937023503168 	 0.590278029409
huber 	 0.15 	 8 	 0.05 	 100 	 0.970426747868 	 0.581065009216
huber 	 0.15 	 9 	 0.05 	 100 	 0.986964960927 	 0.566564430631
huber 	 0.2 	 1 	 0.05 	 100 	 0.54256712556 	 0.528182412243
huber 	 0.2 	 2 	 0.05 	 100 	 0.628663643289 	 0.578246036712
huber 	 0.2 	 3 	 0.05 	 100 	 0.699652365591 	 0.596663826998
huber 	 0.2 	 4 	 0.05 	 100 	 0.770187654079 	 0.598169269974
huber 	 0.2 	 5 	 0.05 	 100 	 0.844262425768 	 0.596027464119
huber 	 0.2 	 6 	 0.05 	 100 	 0.905089969876 	 0.589218799269
huber 	 0.2 	 7 	 0.05 	 100 	 0.954889724486 	 0.579975597426
huber 	 0.2 	 8 	 0.05 	 100 	 0.979431370891 	 0.567294999257
huber 	 0.2 	 9 	 0.05 	 100 	 0.991727830882 	 0.554679291695
huber 	 0.25 	 1 	 0.05 	 100 	 0.559720657401 	 0.541083448514
huber 	 0.25 	 2 	 0.05 	 100 	 0.644148356014 	 0.585851345331
huber 	 0.25 	 3 	 0.05 	 100 	 0.717523746548 	 0.598188059469
huber 	 0.25 	 4 	 0.05 	 100 	 0.787801596543 	 0.595209847844
huber 	 0.25 	 5 	 0.05 	 100 	 0.863754625038 	 0.589298549984
huber 	 0.25 	 6 	 0.05 	 100 	 0.921083919934 	 0.579992048102
huber 	 0.25 	 7 	 0.05 	 100 	 0.965034691734 	 0.568629367988
huber 	 0.25 	 8 	 0.05 	 100 	 0.983509172086 	 0.556772124846
huber 	 0.25 	 9 	 0.05 	 100 	 0.993460904396 	 0.541221352857
huber 	 0.5 	 1 	 0.05 	 100 	 0.595676165738 	 0.564154299715
huber 	 0.5 	 2 	 0.05 	 100 	 0.678149425592 	 0.583144684586
huber 	 0.5 	 3 	 0.05 	 100 	 0.758631929306 	 0.575104151827
huber 	 0.5 	 4 	 0.05 	 100 	 0.835414885511 	 0.550314716196
huber 	 0.5 	 5 	 0.05 	 100 	 0.909606389212 	 0.524410034328
huber 	 0.5 	 6 	 0.05 	 100 	 0.963297462163 	 0.502234861715
huber 	 0.5 	 7 	 0.05 	 100 	 0.984285631928 	 0.484700504454
huber 	 0.5 	 8 	 0.05 	 100 	 0.992078885691 	 0.47157913527
huber 	 0.5 	 9 	 0.05 	 100 	 0.996951070307 	 0.45165522074
huber 	 0.75 	 1 	 0.05 	 100 	 0.606945820357 	 0.559035997392
huber 	 0.75 	 2 	 0.05 	 100 	 0.689569079414 	 0.558096043278
huber 	 0.75 	 3 	 0.05 	 100 	 0.774867188469 	 0.520734366421
huber 	 0.75 	 4 	 0.05 	 100 	 0.859730039591 	 0.46570304241
huber 	 0.75 	 5 	 0.05 	 100 	 0.936316030797 	 0.410348470649
huber 	 0.75 	 6 	 0.05 	 100 	 0.977501177396 	 0.381674699951
huber 	 0.75 	 7 	 0.05 	 100 	 0.987516730978 	 0.365960005809
huber 	 0.75 	 8 	 0.05 	 100 	 0.992953428422 	 0.344039423157
huber 	 0.75 	 9 	 0.05 	 100 	 0.997207734394 	 0.320933042798
quantile 	 0.1 	 1 	 0.05 	 100 	 0.423238318883 	 0.424403863531
quantile 	 0.1 	 2 	 0.05 	 100 	 0.497303156164 	 0.484234218165
quantile 	 0.1 	 3 	 0.05 	 100 	 0.533069482006 	 0.507447726265
quantile 	 0.1 	 4 	 0.05 	 100 	 0.55817468672 	 0.516466130494
quantile 	 0.1 	 5 	 0.05 	 100 	 0.57573691731 	 0.517066136644
quantile 	 0.1 	 6 	 0.05 	 100 	 0.587564188439 	 0.512729279197
quantile 	 0.1 	 7 	 0.05 	 100 	 0.593097890283 	 0.501817690723
quantile 	 0.1 	 8 	 0.05 	 100 	 0.610329728067 	 0.500073852524
quantile 	 0.1 	 9 	 0.05 	 100 	 0.608361601864 	 0.487795052256
quantile 	 0.15 	 1 	 0.05 	 100 	 0.45620187398 	 0.454324648218
quantile 	 0.15 	 2 	 0.05 	 100 	 0.520808531602 	 0.502712682075
quantile 	 0.15 	 3 	 0.05 	 100 	 0.556432890711 	 0.52328556955
quantile 	 0.15 	 4 	 0.05 	 100 	 0.575624901586 	 0.524978122751
quantile 	 0.15 	 5 	 0.05 	 100 	 0.589029428191 	 0.52458893886
quantile 	 0.15 	 6 	 0.05 	 100 	 0.601383893006 	 0.517931405175
quantile 	 0.15 	 7 	 0.05 	 100 	 0.620786448514 	 0.515524550746
quantile 	 0.15 	 8 	 0.05 	 100 	 0.623015806904 	 0.502629595446
quantile 	 0.15 	 9 	 0.05 	 100 	 0.62340850872 	 0.491110477043
quantile 	 0.2 	 1 	 0.05 	 100 	 0.476672151201 	 0.471948670592
quantile 	 0.2 	 2 	 0.05 	 100 	 0.537850554478 	 0.514182800628
quantile 	 0.2 	 3 	 0.05 	 100 	 0.561246711192 	 0.524261876705
quantile 	 0.2 	 4 	 0.05 	 100 	 0.593101129083 	 0.533834121163
quantile 	 0.2 	 5 	 0.05 	 100 	 0.605721657923 	 0.530259911835
quantile 	 0.2 	 6 	 0.05 	 100 	 0.616357172131 	 0.524147428767
quantile 	 0.2 	 7 	 0.05 	 100 	 0.627130242245 	 0.516334188837
quantile 	 0.2 	 8 	 0.05 	 100 	 0.64259263376 	 0.510422616685
quantile 	 0.2 	 9 	 0.05 	 100 	 0.645490438547 	 0.501146966844
quantile 	 0.25 	 1 	 0.05 	 100 	 0.491012773207 	 0.485374274742
quantile 	 0.25 	 2 	 0.05 	 100 	 0.54531751435 	 0.519866482481
quantile 	 0.25 	 3 	 0.05 	 100 	 0.575786948041 	 0.532538708395
quantile 	 0.25 	 4 	 0.05 	 100 	 0.597126659958 	 0.530261221923
quantile 	 0.25 	 5 	 0.05 	 100 	 0.605907437887 	 0.528458734123
quantile 	 0.25 	 6 	 0.05 	 100 	 0.628726805734 	 0.52590127478
quantile 	 0.25 	 7 	 0.05 	 100 	 0.639785679312 	 0.519928756854
quantile 	 0.25 	 8 	 0.05 	 100 	 0.638370522989 	 0.504382082233
quantile 	 0.25 	 9 	 0.05 	 100 	 0.654651683521 	 0.498127846258
quantile 	 0.5 	 1 	 0.05 	 100 	 0.532728180733 	 0.517173859921
quantile 	 0.5 	 2 	 0.05 	 100 	 0.55852769516 	 0.522711256208
quantile 	 0.5 	 3 	 0.05 	 100 	 0.58673388658 	 0.524853661187
quantile 	 0.5 	 4 	 0.05 	 100 	 0.619461519386 	 0.523341562462
quantile 	 0.5 	 5 	 0.05 	 100 	 0.630153104742 	 0.526933204881
quantile 	 0.5 	 6 	 0.05 	 100 	 0.637763280962 	 0.519793379464
quantile 	 0.5 	 7 	 0.05 	 100 	 0.649569346495 	 0.500441390607
quantile 	 0.5 	 8 	 0.05 	 100 	 0.662254206386 	 0.486503893056
quantile 	 0.5 	 9 	 0.05 	 100 	 0.664303147619 	 0.48152678482
quantile 	 0.75 	 1 	 0.05 	 100 	 0.53723249853 	 0.517791488697
quantile 	 0.75 	 2 	 0.05 	 100 	 0.568583975795 	 0.517445645223
quantile 	 0.75 	 3 	 0.05 	 100 	 0.585558293366 	 0.509988370252
quantile 	 0.75 	 4 	 0.05 	 100 	 0.598506842354 	 0.49850085886
quantile 	 0.75 	 5 	 0.05 	 100 	 0.618357956888 	 0.492355617663
quantile 	 0.75 	 6 	 0.05 	 100 	 0.622885058303 	 0.474975221008
quantile 	 0.75 	 7 	 0.05 	 100 	 0.656488202042 	 0.473372144055
quantile 	 0.75 	 8 	 0.05 	 100 	 0.660201407577 	 0.451091621879
quantile 	 0.75 	 9 	 0.05 	 100 	 0.672359070201 	 0.445650099879

Create interactive chart

Must rerun for interactive chart!

In [4]:
%%javascript
// Since I append the div later, sometimes there are multiple divs.
$("#container1").remove();

// Make the cdiv to contain the chart.
element.append('<div id="container1" style="min-width: 310px; height: 400px; margin: 0 auto"></div>');

// Require highcarts and make the chart.
require(['highcharts_exports'], function(Highcharts) {
    $('#container1').highcharts({
        title: {
        text: 'GradientBoostingRegressor'
    },
        plotOptions: {
            scatter: {
                dataLabels: {
                    format: "{point.name}",
                    enabled: true
                },
                
                enableMouseTracking: false
            }
        },
        
        yAxis: {
        title: {
            text: 'test'
        }
    },xAxis: {
        title: {
            text: 'train'
        }
    },
       
        legend: {
            enabled: false
        },
        series: [{name:'ls - LR0.1 - MD1',data:[[0.483195481072,0.474670631554]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.1 - MD2',data:[[0.573312148488,0.542484573263]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.1 - MD3',data:[[0.639864359097,0.572574474065]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.1 - MD4',data:[[0.711570113162,0.591042550157]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.1 - MD5',data:[[0.788853944671,0.597958416133]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.1 - MD6',data:[[0.864170521402,0.599622857668]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.1 - MD7',data:[[0.930424766073,0.596030158475]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.1 - MD8',data:[[0.968269131224,0.587936201096]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.1 - MD9',data:[[0.991099822719,0.57738851849]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.15 - MD1',data:[[0.519173600994,0.506530808563]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.15 - MD2',data:[[0.606877851635,0.564962613525]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.15 - MD3',data:[[0.678489135064,0.589166602604]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.15 - MD4',data:[[0.754201228068,0.59949633042]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.15 - MD5',data:[[0.830949703617,0.601950337646]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.15 - MD6',data:[[0.897615111022,0.5980906965]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.15 - MD7',data:[[0.955596525332,0.590033544289]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.15 - MD8',data:[[0.983457954212,0.579793165364]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.15 - MD9',data:[[0.997128038571,0.567213632333]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.2 - MD1',data:[[0.542924522414,0.526893958616]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.2 - MD2',data:[[0.629785050773,0.578424459603]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.2 - MD3',data:[[0.703825126206,0.595344980637]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.2 - MD4',data:[[0.78146733493,0.601375735172]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.2 - MD5',data:[[0.855945803152,0.595180303543]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.2 - MD6',data:[[0.922226987158,0.590351006137]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.2 - MD7',data:[[0.969728483677,0.580146372507]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.2 - MD8',data:[[0.990807379293,0.568193397591]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.2 - MD9',data:[[0.998564304007,0.552465567057]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.25 - MD1',data:[[0.559405555776,0.540754120087]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.25 - MD2',data:[[0.64650911693,0.586309813089]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.25 - MD3',data:[[0.720777611522,0.598429031394]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.25 - MD4',data:[[0.800040485961,0.59607520576]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.25 - MD5',data:[[0.877771541174,0.590470901025]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.25 - MD6',data:[[0.939282385401,0.578703560005]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.25 - MD7',data:[[0.977229088685,0.567272206962]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.25 - MD8',data:[[0.995125947476,0.554312828629]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.25 - MD9',data:[[0.99955848554,0.540051802073]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.5 - MD1',data:[[0.597612520156,0.563577973277]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.5 - MD2',data:[[0.681079881294,0.582230505545]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.5 - MD3',data:[[0.761461322512,0.57405556384]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.5 - MD4',data:[[0.849517402345,0.544959603289]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.5 - MD5',data:[[0.928608187651,0.514958440526]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.5 - MD6',data:[[0.977741120904,0.492438416477]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.5 - MD7',data:[[0.995730500572,0.474708972368]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.5 - MD8',data:[[0.999773201584,0.461162568518]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.5 - MD9',data:[[0.999991519654,0.445805099294]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.75 - MD1',data:[[0.606852019672,0.561684732504]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.75 - MD2',data:[[0.691586078645,0.56485422784]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.75 - MD3',data:[[0.778952150523,0.512579454393]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.75 - MD4',data:[[0.877017801221,0.462789396318]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.75 - MD5',data:[[0.954800912462,0.400535774188]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.75 - MD6',data:[[0.991349439505,0.359251139115]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.75 - MD7',data:[[0.999104622718,0.338249867878]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.75 - MD8',data:[[0.999969242334,0.320387508687]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'ls - LR0.75 - MD9',data:[[0.999999723331,0.309320133057]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'lad - LR0.1 - MD1',data:[[0.470147244234,0.463792116696]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.1 - MD2',data:[[0.552889121664,0.53014259963]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.1 - MD3',data:[[0.61240896779,0.56339443216]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.1 - MD4',data:[[0.660189367638,0.574401693833]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.1 - MD5',data:[[0.708718359382,0.581121909447]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.1 - MD6',data:[[0.753550867917,0.583106521626]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.1 - MD7',data:[[0.79126941866,0.577200902192]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.1 - MD8',data:[[0.827173658816,0.572023240758]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.1 - MD9',data:[[0.86302982277,0.567393604828]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.15 - MD1',data:[[0.50531548943,0.495426880447]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.15 - MD2',data:[[0.584173448482,0.553337009746]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.15 - MD3',data:[[0.638287211259,0.575925501813]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.15 - MD4',data:[[0.686871547205,0.583072692261]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.15 - MD5',data:[[0.731838475214,0.584479040752]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.15 - MD6',data:[[0.777482549972,0.583185647494]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.15 - MD7',data:[[0.815524920058,0.579326200172]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.15 - MD8',data:[[0.8474383006,0.571369892454]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.15 - MD9',data:[[0.875936489803,0.56192039182]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.2 - MD1',data:[[0.526634366283,0.513101736847]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.2 - MD2',data:[[0.601853945648,0.564377943386]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.2 - MD3',data:[[0.657019346919,0.583643442889]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.2 - MD4',data:[[0.699606843876,0.583923832298]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.2 - MD5',data:[[0.746909026349,0.583862672448]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.2 - MD6',data:[[0.784414301077,0.580506250823]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.2 - MD7',data:[[0.825115258714,0.575212067096]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.2 - MD8',data:[[0.855249257636,0.563140362391]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.2 - MD9',data:[[0.887213285797,0.555947943111]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.25 - MD1',data:[[0.544964643176,0.529631214942]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.25 - MD2',data:[[0.614969012229,0.571436672092]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.25 - MD3',data:[[0.666229813029,0.584058564796]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.25 - MD4',data:[[0.711360297902,0.581802582589]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.25 - MD5',data:[[0.752913444559,0.579061446238]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.25 - MD6',data:[[0.784434659618,0.5741186018]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.25 - MD7',data:[[0.824056919477,0.567506882236]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.25 - MD8',data:[[0.861304393256,0.556674638704]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.25 - MD9',data:[[0.886159913456,0.548212034485]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.5 - MD1',data:[[0.541070180523,0.520994250388]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.5 - MD2',data:[[0.647893425537,0.577629536759]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.5 - MD3',data:[[0.688345597803,0.57000271842]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.5 - MD4',data:[[0.733897399816,0.555695882275]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.5 - MD5',data:[[0.769537623812,0.547997992185]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.5 - MD6',data:[[0.807249667709,0.533532324904]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.5 - MD7',data:[[0.835757890632,0.518192532309]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.5 - MD8',data:[[0.87284285622,0.498597683566]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.5 - MD9',data:[[0.896917529415,0.484213472853]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.75 - MD1',data:[[0.596317283023,0.556183736385]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.75 - MD2',data:[[0.652990330282,0.563488121362]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.75 - MD3',data:[[0.698452223812,0.54571542435]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.75 - MD4',data:[[0.731143277028,0.512059059729]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.75 - MD5',data:[[0.782148045705,0.486256502673]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.75 - MD6',data:[[0.811386651861,0.45511620425]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.75 - MD7',data:[[0.843396315063,0.439166826033]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.75 - MD8',data:[[0.879474561023,0.412184112483]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'lad - LR0.75 - MD9',data:[[0.897729340138,0.390995879317]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'huber - LR0.1 - MD1',data:[[0.484702820443,0.476256697828]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.1 - MD2',data:[[0.572787279837,0.543573328207]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.1 - MD3',data:[[0.636429581541,0.572089154825]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.1 - MD4',data:[[0.705626481701,0.58987898467]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.1 - MD5',data:[[0.782201997196,0.597765022006]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.1 - MD6',data:[[0.852819556298,0.598969575974]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.1 - MD7',data:[[0.914158427534,0.595389577059]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.1 - MD8',data:[[0.955300038064,0.586869064995]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.1 - MD9',data:[[0.980411857223,0.577367233848]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.15 - MD1',data:[[0.519941627966,0.507897862138]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.15 - MD2',data:[[0.607327118468,0.566325242824]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.15 - MD3',data:[[0.676699207771,0.589211731403]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.15 - MD4',data:[[0.746618168939,0.59665550598]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.15 - MD5',data:[[0.821272306105,0.60081752492]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.15 - MD6',data:[[0.885438298961,0.59769842929]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.15 - MD7',data:[[0.937023503168,0.590278029409]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.15 - MD8',data:[[0.970426747868,0.581065009216]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.15 - MD9',data:[[0.986964960927,0.566564430631]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.2 - MD1',data:[[0.54256712556,0.528182412243]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.2 - MD2',data:[[0.628663643289,0.578246036712]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.2 - MD3',data:[[0.699652365591,0.596663826998]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.2 - MD4',data:[[0.770187654079,0.598169269974]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.2 - MD5',data:[[0.844262425768,0.596027464119]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.2 - MD6',data:[[0.905089969876,0.589218799269]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.2 - MD7',data:[[0.954889724486,0.579975597426]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.2 - MD8',data:[[0.979431370891,0.567294999257]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.2 - MD9',data:[[0.991727830882,0.554679291695]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.25 - MD1',data:[[0.559720657401,0.541083448514]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.25 - MD2',data:[[0.644148356014,0.585851345331]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.25 - MD3',data:[[0.717523746548,0.598188059469]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.25 - MD4',data:[[0.787801596543,0.595209847844]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.25 - MD5',data:[[0.863754625038,0.589298549984]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.25 - MD6',data:[[0.921083919934,0.579992048102]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.25 - MD7',data:[[0.965034691734,0.568629367988]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.25 - MD8',data:[[0.983509172086,0.556772124846]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.25 - MD9',data:[[0.993460904396,0.541221352857]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.5 - MD1',data:[[0.595676165738,0.564154299715]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.5 - MD2',data:[[0.678149425592,0.583144684586]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.5 - MD3',data:[[0.758631929306,0.575104151827]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.5 - MD4',data:[[0.835414885511,0.550314716196]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.5 - MD5',data:[[0.909606389212,0.524410034328]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.5 - MD6',data:[[0.963297462163,0.502234861715]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.5 - MD7',data:[[0.984285631928,0.484700504454]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.5 - MD8',data:[[0.992078885691,0.47157913527]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.5 - MD9',data:[[0.996951070307,0.45165522074]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.75 - MD1',data:[[0.606945820357,0.559035997392]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.75 - MD2',data:[[0.689569079414,0.558096043278]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.75 - MD3',data:[[0.774867188469,0.520734366421]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.75 - MD4',data:[[0.859730039591,0.46570304241]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.75 - MD5',data:[[0.936316030797,0.410348470649]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.75 - MD6',data:[[0.977501177396,0.381674699951]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.75 - MD7',data:[[0.987516730978,0.365960005809]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.75 - MD8',data:[[0.992953428422,0.344039423157]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'huber - LR0.75 - MD9',data:[[0.997207734394,0.320933042798]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'quantile - LR0.1 - MD1',data:[[0.423238318883,0.424403863531]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.1 - MD2',data:[[0.497303156164,0.484234218165]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.1 - MD3',data:[[0.533069482006,0.507447726265]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.1 - MD4',data:[[0.55817468672,0.516466130494]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.1 - MD5',data:[[0.57573691731,0.517066136644]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.1 - MD6',data:[[0.587564188439,0.512729279197]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.1 - MD7',data:[[0.593097890283,0.501817690723]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.1 - MD8',data:[[0.610329728067,0.500073852524]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.1 - MD9',data:[[0.608361601864,0.487795052256]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.15 - MD1',data:[[0.45620187398,0.454324648218]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.15 - MD2',data:[[0.520808531602,0.502712682075]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.15 - MD3',data:[[0.556432890711,0.52328556955]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.15 - MD4',data:[[0.575624901586,0.524978122751]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.15 - MD5',data:[[0.589029428191,0.52458893886]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.15 - MD6',data:[[0.601383893006,0.517931405175]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.15 - MD7',data:[[0.620786448514,0.515524550746]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.15 - MD8',data:[[0.623015806904,0.502629595446]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.15 - MD9',data:[[0.62340850872,0.491110477043]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.2 - MD1',data:[[0.476672151201,0.471948670592]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.2 - MD2',data:[[0.537850554478,0.514182800628]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.2 - MD3',data:[[0.561246711192,0.524261876705]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.2 - MD4',data:[[0.593101129083,0.533834121163]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.2 - MD5',data:[[0.605721657923,0.530259911835]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.2 - MD6',data:[[0.616357172131,0.524147428767]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.2 - MD7',data:[[0.627130242245,0.516334188837]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.2 - MD8',data:[[0.64259263376,0.510422616685]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.2 - MD9',data:[[0.645490438547,0.501146966844]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.25 - MD1',data:[[0.491012773207,0.485374274742]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.25 - MD2',data:[[0.54531751435,0.519866482481]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.25 - MD3',data:[[0.575786948041,0.532538708395]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.25 - MD4',data:[[0.597126659958,0.530261221923]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.25 - MD5',data:[[0.605907437887,0.528458734123]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.25 - MD6',data:[[0.628726805734,0.52590127478]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.25 - MD7',data:[[0.639785679312,0.519928756854]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.25 - MD8',data:[[0.638370522989,0.504382082233]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.25 - MD9',data:[[0.654651683521,0.498127846258]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.5 - MD1',data:[[0.532728180733,0.517173859921]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.5 - MD2',data:[[0.55852769516,0.522711256208]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.5 - MD3',data:[[0.58673388658,0.524853661187]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.5 - MD4',data:[[0.619461519386,0.523341562462]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.5 - MD5',data:[[0.630153104742,0.526933204881]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.5 - MD6',data:[[0.637763280962,0.519793379464]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.5 - MD7',data:[[0.649569346495,0.500441390607]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.5 - MD8',data:[[0.662254206386,0.486503893056]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.5 - MD9',data:[[0.664303147619,0.48152678482]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.75 - MD1',data:[[0.53723249853,0.517791488697]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.75 - MD2',data:[[0.568583975795,0.517445645223]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.75 - MD3',data:[[0.585558293366,0.509988370252]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.75 - MD4',data:[[0.598506842354,0.49850085886]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.75 - MD5',data:[[0.618357956888,0.492355617663]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.75 - MD6',data:[[0.622885058303,0.474975221008]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.75 - MD7',data:[[0.656488202042,0.473372144055]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.75 - MD8',data:[[0.660201407577,0.451091621879]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'quantile - LR0.75 - MD9',data:[[0.672359070201,0.445650099879]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}}]
    });
});

A lot of different parameters give the same test score.

In [29]:
reg = GradientBoostingRegressor()
reg.fit (XTrain, np.ravel(y_train))
y_pred_train = reg.predict(XTrain)
y_pred_test = reg.predict(XTest)
print(train_size,"\t",pca_i, "\t",
      metrics.explained_variance_score(y_true = y_train, y_pred = y_pred_train),"\t",
      metrics.explained_variance_score(y_true = y_test, y_pred = y_pred_test))
0.05 	 100 	 0.639864359097 	 0.572575631533

Test different parameters for MLPRegressor

In [30]:
scalery = StandardScaler()
scalery.fit(y_train)
activation_list= ["identity","logistic","tanh","relu"]
solver_list = ['lbfgs','sgd','adam']
for activation_i in activation_list:
    for solver_i in solver_list:
        reg = MLPRegressor( activation=activation_i,solver=solver_i)
        reg.fit (XTrain, np.ravel(scalery.transform(y_train)))
        y_pred_train = scalery.inverse_transform(reg.predict(XTrain))
        y_pred_test = scalery.inverse_transform(reg.predict(XTest))
        print(activation_i,"\t",solver_i,"\t",train_size,"\t",pca_i, "\t",
              metrics.explained_variance_score(y_true = y_train, y_pred = y_pred_train),"\t",
              metrics.explained_variance_score(y_true = y_test, y_pred = y_pred_test))
identity 	 lbfgs 	 0.05 	 100 	 0.607154096275 	 0.600211284445
identity 	 sgd 	 0.05 	 100 	 0.606660310127 	 0.599485790112
identity 	 adam 	 0.05 	 100 	 0.570139742024 	 0.564111497658
logistic 	 lbfgs 	 0.05 	 100 	 0.909963398761 	 0.459365415587
logistic 	 sgd 	 0.05 	 100 	 0.610245249329 	 0.60308284211
C:\anaconda\lib\site-packages\sklearn\neural_network\multilayer_perceptron.py:564: ConvergenceWarning: Stochastic Optimizer: Maximum iterations (200) reached and the optimization hasn't converged yet.
  % self.max_iter, ConvergenceWarning)
logistic 	 adam 	 0.05 	 100 	 0.981950120395 	 0.400868557658
tanh 	 lbfgs 	 0.05 	 100 	 0.990816806993 	 0.0976663334808
tanh 	 sgd 	 0.05 	 100 	 0.6966214225 	 0.58945632735
tanh 	 adam 	 0.05 	 100 	 0.991294687258 	 0.159512907298
relu 	 lbfgs 	 0.05 	 100 	 0.937123552089 	 0.367925087958
relu 	 sgd 	 0.05 	 100 	 0.732436830781 	 0.604889248315
relu 	 adam 	 0.05 	 100 	 0.83767932459 	 0.557892075499

Create interactive chart

Must rerun for interactive chart!

In [5]:
%%javascript
// Since I append the div later, sometimes there are multiple divs.
$("#container2").remove();

// Make the cdiv to contain the chart.
element.append('<div id="container2" style="min-width: 310px; height: 400px; margin: 0 auto"></div>');

// Require highcarts and make the chart.
require(['highcharts_exports'], function(Highcharts) {
    $('#container2').highcharts({
         title: {
        text: 'MLPRegressor'
    },
        plotOptions: {
            scatter: {
                dataLabels: {
                    format: "{point.name}",
                    enabled: true
                },
                
                enableMouseTracking: false
            }
        },
        
        yAxis: {
        title: {
            text: 'test'
        }
    },xAxis: {
        title: {
            text: 'train'
        }
    },
       
        legend: {
            enabled: false
        },
        series: [{name:'identity - lbfgs ',data:[[0.607154096275,0.600211284445]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'identity - sgd ',data:[[0.606660310127,0.599485790112]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'identity - adam ',data:[[0.570139742024,0.564111497658]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'logistic - lbfgs ',data:[[0.909963398761,0.459365415587]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'logistic - sgd ',data:[[0.610245249329,0.60308284211]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'logistic - adam ',data:[[0.981950120395,0.400868557658]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'tanh - lbfgs ',data:[[0.990816806993,0.0976663334808]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'tanh - sgd ',data:[[0.6966214225,0.58945632735]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'tanh - adam ',data:[[0.991294687258,0.159512907298]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'relu - lbfgs ',data:[[0.937123552089,0.367925087958]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'relu - sgd ',data:[[0.732436830781,0.604889248315]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'relu - adam ',data:[[0.83767932459,0.557892075499]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}}]
    });
});

sgd seems to do the best.

In [31]:
scalery = StandardScaler()
scalery.fit(y_train)
reg = MLPRegressor( )
reg.fit (XTrain, np.ravel(scalery.transform(y_train)))
y_pred_train = scalery.inverse_transform(reg.predict(XTrain))
y_pred_test = scalery.inverse_transform(reg.predict(XTest))
print(train_size,"\t",pca_i, "\t",
      metrics.explained_variance_score(y_true = y_train, y_pred = y_pred_train),"\t",
      metrics.explained_variance_score(y_true = y_test, y_pred = y_pred_test))
0.05 	 100 	 0.885264661927 	 0.514486819843

Custom Neural Network Points

In [38]:
from sklearn.preprocessing import MinMaxScaler
def return_model_data_points(train_size, pca_i, input_df, keep_vars, save_tools = False):
    taster_df = pd.get_dummies(input_df[['taster']])
    taster_df = taster_df.drop(taster_df.columns[0], axis=1)
    taster_df = taster_df.reset_index(drop=True)
    Category_df = pd.get_dummies(input_df[['Category']])
    Category_df = Category_df.drop(Category_df.columns[0], axis=1)
    Category_df = Category_df.reset_index(drop=True)
    l5_df = pd.get_dummies(input_df[['l5']])
    l5_df = l5_df.drop(l5_df.columns[0], axis=1)
    l5_df = l5_df.reset_index(drop=True)
    
    dummy_df = pd.concat([taster_df,Category_df,l5_df], axis=1)


    
    train_test_split_output = train_test_split(input_df, dummy_df, input_df[['points']] , random_state=1, test_size=1-train_size)

    df_train, df_test, dummy_df_train, dummy_df_test, y1_train, y1_test = train_test_split_output

    scaler_points = MinMaxScaler(feature_range=(0.01, .99))
    scaler_points.fit(y1_train)
    if save_tools:
        joblib.dump(scaler_points, 'scaler_points.pkl')
    y1_train_s = scaler_points.transform(y1_train)
    y1_test_s = scaler_points.transform(y1_test)
    
    
    vectorizer = TfidfVectorizer()
    #vectorizer = TfidfVectorizer(stop_words=stopwords.words('english'))
    vectorizer.fit(df_train.description.tolist())
    if save_tools:
        joblib.dump(vectorizer, 'vectorizer.pkl')

    Tfidf_df_train = vectorizer.transform(df_train.description.tolist())
    Tfidf_df_test = vectorizer.transform(df_test.description.tolist())

    input_df_train = df_train.reset_index(drop=True).copy(deep=True)
    input_df_test = df_test.reset_index(drop=True).copy(deep=True)

    my_normalizer1 = Normalizer()
    my_normalizer1.fit(Tfidf_df_train)
    if save_tools:
        joblib.dump(my_normalizer1, 'normalizer1.pkl')
        
    Tfidf_df_train = my_normalizer1.transform(Tfidf_df_train)
    Tfidf_df_test = my_normalizer1.transform(Tfidf_df_test)

    svd1 = TruncatedSVD(n_components=pca_i, n_iter=7, random_state=42)
    svd1.fit(Tfidf_df_train)
    if save_tools:
        joblib.dump(svd1, 'svd1.pkl')
        
    text_df_train = pd.DataFrame(svd1.transform(Tfidf_df_train))
    text_df_train = text_df_train.reset_index(drop=True)
    text_df_test = pd.DataFrame(svd1.transform(Tfidf_df_test))
    text_df_test = text_df_test.reset_index(drop=True)

    input_df_train = input_df_train.reset_index(drop=True)
    input_df_test = input_df_test.reset_index(drop=True)
    dummy_df_train = dummy_df_train.reset_index(drop=True)
    dummy_df_test = dummy_df_test.reset_index(drop=True)
    text_df_train = text_df_train.reset_index(drop=True)
    text_df_test = text_df_test.reset_index(drop=True)
    
    final_input_train = pd.concat([input_df_train[keep_vars],dummy_df_train,text_df_train], axis=1)
    final_input_test = pd.concat([input_df_test[keep_vars], dummy_df_test,text_df_test], axis=1)

    scaler = StandardScaler()
    scaler.fit(final_input_train)
    if save_tools:
        joblib.dump(scaler, 'scaler.pkl')

    XTrain = scaler.transform(final_input_train)
    XTest = scaler.transform(final_input_test)

    return(XTrain, XTest,  
           y1_train, y1_test, 
           y1_train_s, y1_test_s,
           vectorizer,my_normalizer1,svd1,scaler,scaler_points)
In [42]:
class neuralNetwork:
    def __init__(self, input_nodes, hidden_nodes, output_nodes, learning_rate,
                 weights_input_to_hidden,weights_hidden_to_output,scaler_points):
        self.input_nodes = input_nodes
        self.hidden_nodes = hidden_nodes
        self.output_nodes = output_nodes
        self.weights_input_to_hidden = weights_input_to_hidden
        self.weights_hidden_to_output = weights_hidden_to_output
        self.learning_rate = learning_rate
        self.y1_n = 1
        self.e = 0
        self.scaler_points = scaler_points
        pass

    
    def activation_function(self,x):
        return sc.special.expit(x)
    
    def get_e(self):
        return self.e
    
    def get_lookup_Tables(self):
        return (self.lookupTable1, self.lookupTable2, self.lookupTable3)
    
    def train(self, inputs_list, targets_list):
        inputs = np.array(inputs_list, ndmin=2).T
        targets = np.array(targets_list, ndmin=2).T
        hidden_inputs = np.dot(self.weights_input_to_hidden, inputs)
        hidden_outputs = self.activation_function(hidden_inputs)
        final_inputs = np.dot(self.weights_hidden_to_output, hidden_outputs)
        final_outputs = self.activation_function(final_inputs)
        output_errors = targets - final_outputs
        hidden_errors = np.dot(self.weights_hidden_to_output.T, output_errors)
        self.weights_hidden_to_output += self.learning_rate * np.dot(
            (output_errors * final_outputs * (1.0 - final_outputs)), np.transpose(hidden_outputs))
        self.weights_input_to_hidden += self.learning_rate * np.dot(
            (hidden_errors * hidden_outputs * (1.0 - hidden_outputs)), np.transpose(inputs))
        self.e += 1
        pass

    def train_df(self, XTrain, y1_train):
        for x, y1 in zip(XTrain, y1_train):
            inputs = np.asfarray(x)
            self.train(inputs, y1)
            pass
        pass

    def query(self, inputs_list):
        inputs = np.array(inputs_list, ndmin=2).T
        hidden_inputs = np.dot(self.weights_input_to_hidden, inputs)
        hidden_outputs = self.activation_function(hidden_inputs)
        final_inputs = np.dot(self.weights_hidden_to_output, hidden_outputs)
        final_outputs = self.activation_function(final_inputs)
        return final_outputs


    def get_accuracy(self, XTrain, y1_train, y1_train_s):

        count_total = 0
        y_1_prediction = []
        for x in XTrain:
            inputs = np.asfarray(x)
            predicted_value = self.query(inputs)
 
            y_1_prediction.append(self.scaler_points.inverse_transform(predicted_value[0][0])[0][0])
            
            count_total += 1

        print(self.hidden_nodes, self.e, count_total,
              metrics.explained_variance_score(y_true = y1_train, y_pred = y_1_prediction), sep='\t')
        return (self.hidden_nodes, self.e, count_total)
    
In [43]:
train_size=.2
pca_i=100
output_data = return_model_data_points(train_size=train_size, pca_i=pca_i, input_df=df2, keep_vars=['price_per_liter_clip'], save_tools = True)
XTrain, XTest,  y1_train, y1_test,y1_train_s, y1_test_s,  vectorizer, normalizer1, svd1, scaler,scaler_points = output_data
In [46]:
y1_n= 1

# number of input, hidden and output nodes
input_nodes = XTrain.shape[1]
hidden_nodes = 100
output_nodes = 1

# learning rate
learning_rate = 0.001
weights_input_to_hidden = np.random.normal(0.0, pow(input_nodes, -0.5),
                                                        (hidden_nodes, input_nodes))
weights_hidden_to_output = np.random.normal(0.0, pow(hidden_nodes, -0.5),
                                                         (output_nodes, hidden_nodes))

n = neuralNetwork(input_nodes, hidden_nodes, output_nodes, learning_rate, 
                 weights_input_to_hidden=weights_input_to_hidden,
                  weights_hidden_to_output=weights_hidden_to_output,scaler_points=scaler_points)
In [47]:
epochs = 50
for e in range(epochs):
    XTrain, y1_train, y1_train_s = shuffle(XTrain, y1_train,y1_train_s)
    n.train_df(XTrain, y1_train_s)
    n.get_accuracy(XTrain,  y1_train, y1_train_s)
    n.get_accuracy(XTest,  y1_test, y1_test_s)
    pass
100	41979	41979	0.476865355386
100	41979	167916	0.477218923812
100	83958	41979	0.589641039695
100	83958	167916	0.588546596474
100	125937	41979	0.617138503993
100	125937	167916	0.615377340686
100	167916	41979	0.62578331654
100	167916	167916	0.623662834873
100	209895	41979	0.62889553909
100	209895	167916	0.626558528971
100	251874	41979	0.63042431661
100	251874	167916	0.627900952914
100	293853	41979	0.631436233198
100	293853	167916	0.628763146679
100	335832	41979	0.632202210426
100	335832	167916	0.629364831414
100	377811	41979	0.632301557825
100	377811	167916	0.629294840102
100	419790	41979	0.632978913559
100	419790	167916	0.629877691744
100	461769	41979	0.633290202922
100	461769	167916	0.630081236283
100	503748	41979	0.633522091636
100	503748	167916	0.630190907484
100	545727	41979	0.633890353599
100	545727	167916	0.630476349082
100	587706	41979	0.634001673426
100	587706	167916	0.630506819694
100	629685	41979	0.634259329029
100	629685	167916	0.630678026042
100	671664	41979	0.634387739692
100	671664	167916	0.630740908153
100	713643	41979	0.634742820067
100	713643	167916	0.631030774431
100	755622	41979	0.634700836538
100	755622	167916	0.630925040418
100	797601	41979	0.635132375443
100	797601	167916	0.631310940672
100	839580	41979	0.635351636091
100	839580	167916	0.631412283026
100	881559	41979	0.635435090832
100	881559	167916	0.631496069688
100	923538	41979	0.635456373286
100	923538	167916	0.631379350651
100	965517	41979	0.635708390833
100	965517	167916	0.631586816922
100	1007496	41979	0.63603526151
100	1007496	167916	0.631856187121
100	1049475	41979	0.636340811359
100	1049475	167916	0.632088499988
100	1091454	41979	0.636227860658
100	1091454	167916	0.63194530438
100	1133433	41979	0.636683093283
100	1133433	167916	0.632286977968
100	1175412	41979	0.63672849179
100	1175412	167916	0.632285703957
100	1217391	41979	0.636807487555
100	1217391	167916	0.632322461282
100	1259370	41979	0.636876685682
100	1259370	167916	0.632344876871
100	1301349	41979	0.637056258573
100	1301349	167916	0.632448006455
100	1343328	41979	0.637548963569
100	1343328	167916	0.632840976959
100	1385307	41979	0.637481745302
100	1385307	167916	0.632699481775
100	1427286	41979	0.637481205413
100	1427286	167916	0.632657981578
100	1469265	41979	0.637640253374
100	1469265	167916	0.632764898196
100	1511244	41979	0.637976723973
100	1511244	167916	0.633070154704
100	1553223	41979	0.638261978346
100	1553223	167916	0.633258811183
100	1595202	41979	0.638203933835
100	1595202	167916	0.63316757594
100	1637181	41979	0.638381842658
100	1637181	167916	0.633226762028
100	1679160	41979	0.638929517814
100	1679160	167916	0.633721312942
100	1721139	41979	0.638581784362
100	1721139	167916	0.633333598648
100	1763118	41979	0.638702204205
100	1763118	167916	0.633416021884
100	1805097	41979	0.639263863602
100	1805097	167916	0.63388720947
100	1847076	41979	0.639259833497
100	1847076	167916	0.633873927296
100	1889055	41979	0.639383928788
100	1889055	167916	0.63388708343
100	1931034	41979	0.639388729534
100	1931034	167916	0.633856365033
100	1973013	41979	0.639521024905
100	1973013	167916	0.633950948046
100	2014992	41979	0.640012491852
100	2014992	167916	0.634337398002
100	2056971	41979	0.639964323043
100	2056971	167916	0.634215570946
100	2098950	41979	0.640104692876
100	2098950	167916	0.634340015271

Create interactive chart

Must rerun for interactive chart!

In [6]:
%%javascript
// Since I append the div later, sometimes there are multiple divs.
$("#containernnp").remove();

// Make the cdiv to contain the chart.
element.append('<div id="containernnp" style="min-width: 310px; height: 400px; margin: 0 auto"></div>');

// Require highcarts and make the chart.
require(['highcharts_exports'], function(Highcharts) {
    $('#containernnp').highcharts({
         title: {
        text: 'Custom Neural Network Points'
    },
        plotOptions: {
            scatter: {
                dataLabels: {
                    format: "{point.name}",
                    enabled: true
                },
                
                enableMouseTracking: false
            }
        },
        
        yAxis: {
        title: {
            text: 'test'
        }
    },xAxis: {
        title: {
            text: 'train'
        }
    },
       
        legend: {
            enabled: false
        },
        series: [{name:'epoch 1',data:[[0.476865355386,0.477218923812]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 2',data:[[0.589641039695,0.588546596474]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 3',data:[[0.617138503993,0.615377340686]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 4',data:[[0.62578331654,0.623662834873]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 5',data:[[0.62889553909,0.626558528971]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 6',data:[[0.63042431661,0.627900952914]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 7',data:[[0.631436233198,0.628763146679]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 8',data:[[0.632202210426,0.629364831414]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 9',data:[[0.632301557825,0.629294840102]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 10',data:[[0.632978913559,0.629877691744]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 11',data:[[0.633290202922,0.630081236283]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 12',data:[[0.633522091636,0.630190907484]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 13',data:[[0.633890353599,0.630476349082]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 14',data:[[0.634001673426,0.630506819694]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 15',data:[[0.634259329029,0.630678026042]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 16',data:[[0.634387739692,0.630740908153]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 17',data:[[0.634742820067,0.631030774431]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 18',data:[[0.634700836538,0.630925040418]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 19',data:[[0.635132375443,0.631310940672]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 20',data:[[0.635351636091,0.631412283026]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 21',data:[[0.635435090832,0.631496069688]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 22',data:[[0.635456373286,0.631379350651]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 23',data:[[0.635708390833,0.631586816922]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 24',data:[[0.63603526151,0.631856187121]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 25',data:[[0.636340811359,0.632088499988]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 26',data:[[0.636227860658,0.63194530438]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 27',data:[[0.636683093283,0.632286977968]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 28',data:[[0.63672849179,0.632285703957]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 29',data:[[0.636807487555,0.632322461282]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 30',data:[[0.636876685682,0.632344876871]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 31',data:[[0.637056258573,0.632448006455]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 32',data:[[0.637548963569,0.632840976959]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 33',data:[[0.637481745302,0.632699481775]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 34',data:[[0.637481205413,0.632657981578]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 35',data:[[0.637640253374,0.632764898196]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 36',data:[[0.637976723973,0.633070154704]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 37',data:[[0.638261978346,0.633258811183]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 38',data:[[0.638203933835,0.63316757594]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 39',data:[[0.638381842658,0.633226762028]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 40',data:[[0.638929517814,0.633721312942]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 41',data:[[0.638581784362,0.633333598648]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 42',data:[[0.638702204205,0.633416021884]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 43',data:[[0.639263863602,0.63388720947]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 44',data:[[0.639259833497,0.633873927296]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 45',data:[[0.639383928788,0.63388708343]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 46',data:[[0.639388729534,0.633856365033]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 47',data:[[0.639521024905,0.633950948046]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 48',data:[[0.640012491852,0.634337398002]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 49',data:[[0.639964323043,0.634215570946]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}},
{name:'epoch 50',data:[[0.640104692876,0.634340015271]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'circle'}}]
    });
});

Only the SVM does better than this custom neural network

Analysis Classification Methodology

Predict the Country, Category, and Taster from the points, price, and review. The following methods have been tried: MLP, Bagging, AdaBoostr, DecisionTree, RandomForest, ExtraTrees, GradientBoosting, C-Support Vector Classification, KNeighbors, and VotingEnsemble.

Run a neural network backwards to discover the terms per counrty, category, and taster that matter.

In [26]:
def return_model_data(train_size, pca_i, input_df, keep_vars, save_tools = False):
    lookupTable1, indexed_1 = np.unique(input_df[['taster']],   return_inverse=True)
    lookupTable2, indexed_2 = np.unique(input_df[['Category']], return_inverse=True)
    lookupTable3, indexed_3 = np.unique(input_df[['l5']],       return_inverse=True)

    train_test_split_output = train_test_split(input_df, indexed_1,indexed_2,indexed_3, random_state=1, test_size=1-train_size)

    df_train, df_test, y1_train, y1_test, y2_train, y2_test, y3_train, y3_test = train_test_split_output

    vectorizer = TfidfVectorizer()
    #vectorizer = TfidfVectorizer(stop_words=stopwords.words('english'))
    vectorizer.fit(df_train.description.tolist())
    if save_tools:
        joblib.dump(vectorizer, 'vectorizer.pkl')

    Tfidf_df_train = vectorizer.transform(df_train.description.tolist())
    Tfidf_df_test = vectorizer.transform(df_test.description.tolist())

    input_df_train = df_train.reset_index(drop=True).copy(deep=True)
    input_df_test = df_test.reset_index(drop=True).copy(deep=True)

    my_normalizer1 = Normalizer()
    my_normalizer1.fit(Tfidf_df_train)
    if save_tools:
        joblib.dump(normalizer1, 'normalizer1.pkl')

    Tfidf_df_train = my_normalizer1.transform(Tfidf_df_train)
    Tfidf_df_test = my_normalizer1.transform(Tfidf_df_test)

    svd1 = TruncatedSVD(n_components=pca_i, n_iter=7, random_state=42)
    svd1.fit(Tfidf_df_train)
    if save_tools:
        joblib.dump(svd1, 'svd1.pkl')

    text_df_train = pd.DataFrame(svd1.transform(Tfidf_df_train))
    text_df_train = text_df_train.reset_index(drop=True)
    text_df_test = pd.DataFrame(svd1.transform(Tfidf_df_test))
    text_df_test = text_df_test.reset_index(drop=True)

    final_input_train = pd.concat([input_df_train[keep_vars],text_df_train], axis=1)
    final_input_test = pd.concat([input_df_test[keep_vars],text_df_test], axis=1)

    scaler = StandardScaler()
    scaler.fit(final_input_train)
    if save_tools:
        joblib.dump(scaler, 'scaler.pkl')

    XTrain = scaler.transform(final_input_train)
    XTest = scaler.transform(final_input_test)

    return(XTrain, XTest, 
           y1_train, y1_test, 
           y2_train, y2_test, 
           y3_train, y3_test,
           lookupTable1,lookupTable2,lookupTable3,

           vectorizer,my_normalizer1,svd1,scaler)
In [86]:
from IPython.core.display import HTML
train_size=.01
pca_i=50
output_data = return_model_data(train_size=train_size, pca_i=pca_i, input_df=df2, keep_vars=['points','price_per_liter'])
XTrain, XTest, y1_train, y1_test, y2_train, y2_test, y3_train, y3_test,lookupTable1,lookupTable2,lookupTable3,vectorizer,normalizer1,svd1,scaler = output_data

MLPClassifier

In [90]:
mlpc = MLPClassifier(solver='lbfgs', alpha=1e-5, random_state=1)
mlpc.fit(XTrain, y1_train)  
y_pred_train = mlpc.predict(XTrain)
y_pred_test = mlpc.predict(XTest)
print(train_size,"\t",pca_i, "\t",
      metrics.accuracy_score(y_true = y1_train, y_pred = y_pred_train),"\t",
      metrics.accuracy_score(y_true = y1_test, y_pred = y_pred_test))
display(HTML(pd.DataFrame(metrics.confusion_matrix(y_true=y1_test, y_pred=y_pred_test)).to_html()))
0.01 	 50 	 1.0 	 0.69877332204
0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20
0 41 22 16 0 4 0 4 9 45 28 46 59 104 20 61 12 10 0 26 0 91
1 76 3515 76 0 28 0 16 19 240 48 117 111 225 27 491 131 31 0 55 38 229
2 5 83 2360 0 47 0 2 40 98 50 49 197 183 5 213 114 197 0 10 15 120
3 21 18 1 0 0 0 2 1 1 3 17 5 50 6 13 1 2 0 5 8 14
4 7 10 23 0 10 0 0 7 61 10 24 18 46 9 50 16 12 0 4 2 28
5 12 5 0 0 0 0 0 0 1 5 7 6 8 3 3 1 1 0 2 1 0
6 6 23 6 0 2 0 41 8 12 31 27 47 57 40 99 46 9 0 52 14 17
7 4 28 66 0 1 0 16 2849 136 5 34 28 158 7 720 181 168 0 46 29 175
8 40 367 220 0 74 0 2 192 5223 61 164 108 1610 21 2585 1079 210 0 291 109 422
9 63 76 12 0 18 0 5 26 34 10366 84 95 134 3 446 22 39 0 59 11 52
10 56 42 40 0 2 0 3 21 261 92 703 23 552 29 460 89 39 0 31 76 83
11 8 63 167 0 50 0 9 9 45 35 21 5812 283 17 268 83 44 0 39 24 126
12 65 162 96 0 31 0 2 136 513 76 345 203 22138 22 2672 780 166 0 210 211 162
13 8 44 1 0 2 0 20 3 18 28 9 70 28 147 117 33 2 0 28 5 27
14 171 495 381 0 76 0 10 818 2491 496 458 600 4875 78 45172 2900 1577 0 498 513 1478
15 9 184 176 0 35 0 10 172 705 25 94 197 1128 5 3368 9435 964 0 246 48 808
16 2 25 113 0 15 0 6 179 185 12 76 7 88 3 1117 325 27424 0 12 85 112
17 0 0 0 0 0 0 0 5 9 1 1 2 8 1 94 6 2 0 0 0 4
18 11 87 32 0 6 0 1 51 273 26 21 156 380 8 721 397 77 0 3028 16 136
19 17 15 18 0 15 0 9 27 60 49 114 80 312 41 476 26 106 0 31 988 49
20 40 309 136 0 25 0 40 226 347 80 62 300 391 69 1951 804 250 0 100 26 5951
In [91]:
mlpc = MLPClassifier(solver='lbfgs', alpha=1e-5, random_state=1)
mlpc.fit(XTrain, y2_train)  
y_pred_train = mlpc.predict(XTrain)
y_pred_test = mlpc.predict(XTest)
print(train_size,"\t",pca_i, "\t",
      metrics.accuracy_score(y_true = y2_train, y_pred = y_pred_train),"\t",
      metrics.accuracy_score(y_true = y2_test, y_pred = y_pred_test))
display(HTML(pd.DataFrame(metrics.confusion_matrix(y_true=y2_test, y_pred=y_pred_test)).to_html()))
0.01 	 50 	 1.0 	 0.858154833804
0 1 2 3 4 5 6
0 908 0 62 444 33 240 1117
1 14 0 2 36 2 15 40
2 73 0 59 851 24 76 280
3 312 0 279 118959 1904 817 4255
4 41 0 6 2283 1208 419 1904
5 191 0 130 1136 290 1851 5344
6 801 0 334 2532 670 2518 55337
In [92]:
mlpc = MLPClassifier(solver='lbfgs', alpha=1e-5, random_state=1)
mlpc.fit(XTrain, y3_train)  
y_pred_train = mlpc.predict(XTrain)
y_pred_test = mlpc.predict(XTest)
print(train_size,"\t",pca_i, "\t",
      metrics.accuracy_score(y_true = y3_train, y_pred = y_pred_train),"\t",
      metrics.accuracy_score(y_true = y3_test, y_pred = y_pred_test))

display(HTML(pd.DataFrame(metrics.confusion_matrix(y_true=y3_test, y_pred=y_pred_test)).to_html()))
0.01 	 50 	 1.0 	 0.582068076055
0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
1 0 1448 0 175 17 0 0 0 4 1818 0 0 0 0 3 129 0 15 3 1 1 13 332 0 0 0 0 0 0 29 1 0 0 14 0 23 0 0 0 0 44 0 2316 0 11 634 0 0 2
2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 0 0 0 0 1 0 0 0
3 0 200 0 1504 110 0 0 0 12 450 0 3 0 0 16 566 0 132 13 12 0 16 669 0 0 0 0 0 0 15 5 0 0 382 0 110 0 0 0 0 128 0 473 1 28 2344 0 0 7
4 0 19 0 112 672 0 0 0 8 25 0 0 0 0 32 1631 0 90 5 4 2 2 295 0 0 0 0 0 0 0 4 0 0 77 0 319 0 0 0 0 44 0 87 0 8 803 0 0 1
5 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0
6 0 12 0 0 0 0 0 0 0 20 0 0 0 0 0 2 0 0 0 0 0 0 3 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 31 0 0 6 0 0 0
7 0 2 0 3 1 0 0 0 0 5 0 0 0 0 1 7 0 2 1 0 0 7 36 0 0 0 0 0 0 3 5 0 0 2 0 3 0 0 0 0 9 0 8 0 10 83 0 0 0
8 0 6 0 7 5 0 0 0 1 13 0 0 0 0 0 28 0 20 1 5 0 1 31 0 0 0 0 0 0 0 0 0 0 11 0 3 0 0 0 0 7 0 13 0 1 186 0 0 0
9 0 1279 0 217 36 0 0 0 3 2430 0 0 0 0 14 147 0 20 10 3 0 26 428 0 0 0 0 0 0 22 1 0 0 42 0 25 0 0 0 0 83 0 2369 1 14 819 0 0 12
10 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 4 0 0 0
11 0 3 0 1 2 0 0 0 0 2 0 0 0 0 0 8 0 4 2 1 0 4 16 0 0 0 0 0 0 1 0 0 0 2 0 1 0 0 0 0 6 0 2 0 4 49 0 0 0
12 0 1 0 0 0 0 0 0 1 2 0 0 0 0 0 2 0 0 0 0 0 1 4 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 0 0 0 0 6 0 0 0
13 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0 2 0 0 0 0 3 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 0 0 6 0 0 0
14 0 0 0 1 22 0 0 0 0 0 0 0 0 0 5 17 0 1 1 1 0 0 4 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 1 0 0 11 0 0 0
15 0 158 0 826 889 0 0 0 20 326 0 0 0 0 49 16125 0 253 15 34 1 28 1128 0 0 0 0 0 0 11 4 0 0 282 0 3609 0 0 0 0 335 0 458 2 44 3334 0 0 9
16 0 0 0 1 3 0 0 0 0 0 0 0 0 0 0 5 0 3 1 0 0 5 30 0 0 0 0 0 0 0 2 0 0 0 0 0 0 0 0 0 10 0 4 0 5 37 0 0 0
17 0 28 0 162 52 0 0 0 9 45 0 0 0 0 29 348 0 1390 3 13 0 7 165 0 0 0 0 0 0 2 8 0 0 133 0 41 0 0 0 0 68 0 187 0 4 1117 0 0 2
18 0 15 0 12 16 0 0 0 5 27 0 2 0 0 0 68 0 7 23 16 5 5 193 0 0 0 0 0 0 3 3 0 0 7 0 13 0 0 0 0 77 0 38 12 13 305 0 0 6
19 0 5 0 2 9 0 0 0 3 6 0 1 0 0 0 31 0 10 1 8 1 1 34 0 0 0 0 0 0 2 3 0 0 3 0 3 0 0 0 0 8 0 12 2 4 87 0 0 1
20 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 2 2 0 0 0
21 0 12 0 42 3 0 0 0 1 25 0 0 0 0 3 35 0 13 3 4 1 39 140 0 0 0 0 0 0 7 7 0 0 9 0 4 0 0 0 0 63 0 32 1 67 251 0 0 0
22 0 406 0 508 137 0 0 0 34 510 0 0 0 0 13 861 0 123 20 24 1 188 18855 0 0 0 0 0 0 13 13 0 0 193 0 163 0 0 0 0 214 0 666 4 50 3836 0 0 14
23 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 0
24 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 1 0 1 1 0 0 0
25 0 2 0 0 2 0 0 0 0 0 0 0 0 0 0 3 0 1 0 0 0 1 9 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 1 1 0 30 0 0 0
26 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 2 0 0 0
27 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 5 0 0 0
28 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 1 0 0 1 0 0 0 3 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 0 0 0 0 8 0 0 0
29 0 14 0 1 1 0 0 0 0 31 0 0 0 0 0 1 0 0 0 0 0 0 5 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 33 0 0 6 0 0 0
30 0 6 0 0 1 0 0 0 0 4 0 0 0 0 0 5 0 1 0 0 0 1 11 0 0 0 0 0 0 1 1 0 0 0 0 1 0 0 0 0 7 0 6 1 4 40 0 0 1
31 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0
32 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 6 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 2 15 0 0 0
33 0 50 0 655 64 0 0 0 6 181 0 0 0 0 22 279 0 110 4 5 0 11 250 0 0 0 0 0 0 1 1 0 0 318 0 31 0 0 0 0 41 0 135 0 2 1013 0 0 1
34 0 5 0 0 0 0 0 0 0 7 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 6 0 0 2 0 0 0
35 0 15 0 98 121 0 0 0 3 45 0 1 0 0 10 4373 0 33 1 1 1 2 133 0 0 0 0 0 0 0 1 0 0 13 0 1794 0 0 0 0 49 0 44 1 2 568 0 0 2
36 0 2 0 3 2 0 0 0 0 2 0 0 0 0 2 14 0 4 1 1 1 1 44 0 0 0 0 0 0 1 1 0 0 2 0 0 0 0 0 0 9 0 6 0 4 65 0 0 1
37 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 3 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 2 0 0 4 0 0 0
38 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0
39 0 1 0 2 2 0 0 0 0 0 0 0 0 0 1 5 0 8 0 0 0 1 13 0 0 0 0 0 0 0 2 0 0 0 0 0 0 0 0 0 4 0 1 0 4 47 0 0 0
40 0 97 0 153 56 0 0 0 15 201 0 2 0 0 24 308 0 47 30 17 0 22 649 0 0 0 0 0 0 16 5 0 0 70 0 49 0 0 0 0 386 0 214 10 30 807 0 0 10
41 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0
42 0 1545 0 291 50 0 0 0 10 2461 0 0 0 0 10 364 0 26 10 9 0 27 756 0 0 0 0 0 0 45 0 0 0 87 0 46 0 0 0 0 118 0 4577 2 11 1206 0 0 6
43 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 3 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 6 0 0 0
44 0 1 0 3 0 0 0 0 0 0 0 0 0 0 0 5 0 1 3 2 0 9 19 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 5 0 1 0 6 48 0 0 0
45 0 1269 0 2056 740 0 0 2 93 1715 0 13 0 0 77 4160 0 1159 118 130 6 251 5075 0 0 0 0 0 0 70 33 0 0 929 0 983 0 0 0 0 1055 0 2080 15 187 71370 0 0 49
46 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
47 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0 2 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 0 1 0 2 8 0 0 0
48 0 29 0 4 0 0 0 0 0 57 0 0 0 0 0 7 0 0 0 0 0 0 9 0 0 0 0 0 0 2 0 0 0 0 0 0 0 0 0 0 0 0 65 0 0 12 0 0 0

BaggingClassifier

In [93]:
bgc = BaggingClassifier(random_state=41)
bgc.fit(XTrain, y1_train)  
y_pred_train = bgc.predict(XTrain)
y_pred_test = bgc.predict(XTest)
print(train_size,"\t",pca_i, "\t",
      metrics.accuracy_score(y_true = y1_train, y_pred = y_pred_train),"\t",
      metrics.accuracy_score(y_true = y1_test, y_pred = y_pred_test))
display(HTML(pd.DataFrame(metrics.confusion_matrix(y_true=y1_test, y_pred=y_pred_test)).to_html()))
0.01 	 50 	 0.993326978074 	 0.569632862842
0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20
0 1 30 9 0 0 0 0 4 42 87 14 86 115 1 158 20 4 0 4 0 23
1 7 1717 90 0 0 0 4 24 373 291 21 360 620 1 1567 110 62 0 20 2 204
2 3 240 441 0 2 0 1 26 255 112 15 456 382 2 1334 139 182 0 5 3 190
3 0 9 3 0 0 0 0 1 5 11 2 10 58 0 61 5 0 0 0 0 3
4 2 9 6 0 0 0 1 1 32 13 1 19 44 1 165 8 10 0 2 0 23
5 1 2 0 0 0 0 0 0 1 11 1 11 13 0 12 0 1 0 1 0 1
6 1 28 10 0 0 0 6 8 24 59 3 55 61 5 224 26 7 0 7 2 11
7 0 44 78 0 1 0 3 776 233 13 11 61 380 0 2308 207 307 0 13 2 214
8 15 327 150 1 2 0 2 125 1862 189 34 279 2091 8 6453 657 149 0 39 1 394
9 3 75 18 1 5 0 3 4 83 9880 37 92 267 7 899 24 85 0 13 0 49
10 10 76 30 0 1 0 0 12 211 197 51 111 767 2 993 54 38 0 7 2 40
11 1 133 135 1 2 0 6 14 221 500 52 3504 631 5 1460 161 21 0 22 0 234
12 19 155 83 0 5 0 2 75 660 137 33 287 19539 2 6265 477 74 0 26 4 147
13 1 17 6 0 0 0 1 0 25 121 2 79 35 17 229 26 2 0 6 3 20
14 21 481 242 0 5 0 7 277 1895 842 91 794 5112 21 49396 1566 1448 0 87 4 798
15 4 214 98 0 5 0 4 89 813 44 17 305 1442 7 10235 3234 482 0 28 1 587
16 0 54 86 0 3 0 0 89 276 3 10 45 253 0 3201 330 25342 0 1 2 91
17 0 1 1 0 0 0 0 0 3 0 0 1 17 0 102 7 1 0 0 0 0
18 5 174 40 0 1 0 3 25 367 202 28 273 1018 4 2428 436 12 0 227 1 183
19 1 56 30 0 0 0 3 14 130 156 13 177 338 1 1355 43 66 0 7 7 36
20 9 247 166 0 4 0 20 167 520 167 17 533 577 13 5155 829 280 0 32 3 2368
In [94]:
bgc = BaggingClassifier(random_state=41)
bgc.fit(XTrain, y2_train)  
y_pred_train = bgc.predict(XTrain)
y_pred_test = bgc.predict(XTest)
print(train_size,"\t",pca_i, "\t",
      metrics.accuracy_score(y_true = y2_train, y_pred = y_pred_train),"\t",
      metrics.accuracy_score(y_true = y2_test, y_pred = y_pred_test))
display(HTML(pd.DataFrame(metrics.confusion_matrix(y_true=y2_test, y_pred=y_pred_test)).to_html()))
0.01 	 50 	 0.988083889418 	 0.850993998951
0 1 2 3 4 5 6
0 129 0 5 833 2 159 1676
1 0 0 0 66 0 4 39
2 7 0 0 965 4 15 372
3 45 0 1 121556 71 192 4661
4 8 0 0 2869 58 76 2850
5 36 0 6 1605 25 462 6808
6 125 0 10 6341 116 971 54629
In [95]:
bgc = BaggingClassifier(random_state=41)
bgc.fit(XTrain, y3_train)  
y_pred_train = bgc.predict(XTrain)
y_pred_test = bgc.predict(XTest)
print(train_size,"\t",pca_i, "\t",
      metrics.accuracy_score(y_true = y3_train, y_pred = y_pred_train),"\t",
      metrics.accuracy_score(y_true = y3_test, y_pred = y_pred_test))
display(HTML(pd.DataFrame(metrics.confusion_matrix(y_true=y3_test, y_pred=y_pred_test)).to_html()))
0.01 	 50 	 0.990943755958 	 0.597799775743
0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0
1 0 1286 0 184 3 0 0 0 0 1222 0 0 0 0 0 55 0 18 0 0 0 2 297 0 0 0 0 0 0 0 0 0 0 1 0 3 0 0 0 0 10 0 1378 0 0 2574 0 0 0
2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 0 0 0
3 0 138 0 389 21 0 0 0 0 237 0 0 0 0 0 216 0 34 1 0 0 3 554 0 0 0 0 0 0 0 0 0 0 11 0 6 0 0 0 0 12 0 260 0 0 5314 0 0 0
4 0 16 0 79 65 0 0 0 0 27 0 0 0 0 1 1493 0 26 0 0 0 3 383 0 0 0 0 0 0 0 0 0 0 2 0 69 0 0 0 0 12 0 32 0 0 2032 0 0 0
5 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 0 0 0
6 0 5 0 0 0 0 0 0 0 13 0 0 0 0 0 1 0 0 0 0 0 0 4 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 28 0 0 23 0 0 0
7 0 3 0 3 0 0 0 0 0 6 0 0 0 0 0 9 0 5 0 0 0 2 38 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 6 0 1 114 0 0 0
8 0 4 0 3 1 0 0 0 0 1 0 0 0 0 0 14 0 3 0 0 0 0 26 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 285 0 0 0
9 0 1324 0 223 8 0 0 0 0 1494 0 0 0 0 0 55 0 12 0 0 0 6 381 0 0 0 0 0 0 1 0 0 0 1 0 6 0 0 0 0 12 0 1469 0 0 3009 0 0 0
10 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 6 0 0 0
11 0 2 0 1 1 0 0 0 0 1 0 0 0 0 0 4 0 0 0 0 0 0 15 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 0 0 81 0 0 0
12 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 2 0 0 0 0 0 0 5 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 10 0 0 0
13 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 4 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 9 0 0 0
14 0 0 0 0 2 0 0 0 0 0 0 0 0 0 0 7 0 0 0 0 0 0 4 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 52 0 0 0
15 0 125 0 370 199 0 0 0 0 210 0 0 0 0 2 17592 0 61 0 0 0 13 1123 0 0 0 0 0 0 0 1 0 0 10 0 707 0 0 0 0 24 0 271 0 0 7232 0 0 0
16 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 6 0 0 0 0 0 0 18 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 79 0 0 0
17 0 26 0 118 18 0 0 0 0 36 0 0 0 0 3 304 0 157 0 1 1 1 388 0 0 0 0 0 0 0 0 0 0 3 0 5 0 0 0 0 2 0 81 0 0 2669 0 0 0
18 0 8 0 18 7 0 0 0 0 10 0 0 0 0 0 52 0 10 0 0 0 1 220 0 0 0 0 0 0 0 0 0 0 0 0 5 0 0 0 0 3 0 19 0 0 518 0 0 0
19 0 5 0 1 1 0 0 0 0 1 0 0 0 0 0 16 0 5 0 0 0 0 38 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 0 0 167 0 0 0
20 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 4 0 0 0
21 0 27 0 12 4 0 0 0 0 18 0 0 0 0 0 13 0 2 0 0 0 1 124 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 3 0 22 0 1 533 0 0 0
22 0 181 0 293 44 0 0 1 0 224 0 0 0 0 4 644 0 47 2 1 0 11 15022 0 0 0 0 0 0 0 0 0 0 8 0 42 0 0 0 0 12 0 320 0 0 9990 0 0 0
23 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0
24 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 2 0 0 0
25 0 0 0 0 1 0 0 0 0 1 0 0 0 0 0 3 0 0 0 0 0 0 8 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 37 0 0 0
26 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0
27 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 6 0 0 0
28 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 5 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 11 0 0 0
29 0 13 0 8 0 0 0 0 0 14 0 0 0 0 0 0 0 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 16 0 0 40 0 0 0
30 0 3 0 2 0 0 0 0 0 1 0 0 0 0 0 4 0 2 0 0 0 0 12 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 7 0 0 59 0 0 0
31 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
32 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 5 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 20 0 0 0
33 0 42 0 142 13 0 0 0 0 86 0 0 0 0 0 110 0 21 0 0 0 2 237 0 0 0 0 0 0 0 0 0 0 3 0 2 0 0 0 0 4 0 85 0 0 2433 0 0 0
34 0 4 0 0 0 0 0 0 0 2 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 5 0 0 7 0 0 0
35 0 14 0 60 32 0 0 0 0 38 0 0 0 0 1 5064 0 7 0 0 0 1 92 0 0 0 0 0 0 0 0 0 0 1 0 737 0 0 0 0 5 0 36 0 0 1223 0 0 0
36 0 0 0 2 2 0 0 0 0 3 0 0 0 0 0 5 0 8 0 0 0 2 30 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 6 0 0 107 0 0 0
37 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 9 0 0 0
38 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0
39 0 3 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 13 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 1 0 4 0 0 67 0 0 1
40 0 88 0 137 20 0 0 0 0 116 0 0 0 0 3 237 0 30 0 0 1 10 563 0 0 0 0 0 0 1 2 0 0 0 0 23 0 0 0 0 20 0 154 0 0 1813 0 0 0
41 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0
42 0 1586 0 256 15 0 0 0 0 1559 0 0 0 0 0 150 0 37 0 0 0 8 666 0 0 0 0 0 0 2 0 0 0 14 0 7 0 0 0 0 21 0 2537 0 0 4799 0 0 0
43 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 9 0 0 0
44 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 1 0 0 0 0 21 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 78 0 0 0
45 0 482 0 691 162 0 0 0 1 674 0 0 0 0 9 2174 0 157 1 1 0 11 3246 0 0 0 0 0 0 1 2 0 0 27 0 122 0 0 0 0 55 0 900 0 0 84918 0 0 1
46 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0
47 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 6 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 10 0 0 0
48 0 34 0 1 0 0 0 0 0 34 0 0 0 0 0 3 0 0 0 0 0 0 5 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 48 0 0 60 0 0 0

AdaBoostClassifier

In [96]:
abc = AdaBoostClassifier(random_state=41)
abc.fit(XTrain, y1_train)  
y_pred_train = abc.predict(XTrain)
y_pred_test = abc.predict(XTest)
print(train_size,"\t",pca_i, "\t",
      metrics.accuracy_score(y_true = y1_train, y_pred = y_pred_train),"\t",
      metrics.accuracy_score(y_true = y1_test, y_pred = y_pred_test))
display(HTML(pd.DataFrame(metrics.confusion_matrix(y_true=y1_test, y_pred=y_pred_test)).to_html()))
0.01 	 50 	 0.370829361296 	 0.379793741007
0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 598 0 0 0 0 0 0
1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 5472 0 1 0 0 0 0
2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3772 0 16 0 0 0 0
3 0 0 0 0 0 0 0 0 0 0 0 0 0 0 168 0 0 0 0 0 0
4 0 0 0 0 0 0 0 0 0 0 0 0 0 0 337 0 0 0 0 0 0
5 0 0 0 0 0 0 0 0 0 0 0 0 0 0 55 0 0 0 0 0 0
6 0 0 0 0 0 0 0 0 0 0 0 0 0 0 536 0 1 0 0 0 0
7 0 0 0 0 0 0 0 6 0 0 0 0 0 0 4531 0 114 0 0 0 0
8 0 0 0 0 0 0 0 0 0 0 0 0 0 0 12708 0 70 0 0 0 0
9 0 0 0 0 0 0 0 1 0 0 0 0 0 0 11543 0 1 0 0 0 0
10 0 0 0 0 0 0 0 1 0 0 0 0 0 0 2601 0 0 0 0 0 0
11 0 0 0 0 0 0 0 0 0 0 0 0 0 0 7103 0 0 0 0 0 0
12 0 0 0 0 0 0 0 6 0 0 0 0 0 0 27921 0 63 0 0 0 0
13 0 0 0 0 0 0 0 0 0 0 0 0 0 0 590 0 0 0 0 0 0
14 0 0 0 0 0 0 0 4 0 0 0 0 0 0 61020 0 2063 0 0 0 0
15 0 0 0 0 0 0 0 0 0 0 0 0 0 0 17382 0 227 0 0 0 0
16 0 0 0 0 0 0 0 2 0 0 0 0 0 0 11890 0 17894 0 0 0 0
17 0 0 0 0 0 0 0 0 0 0 0 0 0 0 133 0 0 0 0 0 0
18 0 0 0 0 0 0 0 0 0 0 0 0 0 0 5424 0 3 0 0 0 0
19 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2421 0 12 0 0 0 0
20 0 0 0 0 0 0 0 1 0 0 0 0 0 0 11056 0 50 0 0 0 0
In [97]:
abc = AdaBoostClassifier(random_state=41)
abc.fit(XTrain, y2_train)  
y_pred_train = abc.predict(XTrain)
y_pred_test = abc.predict(XTest)
print(train_size,"\t",pca_i, "\t",
      metrics.accuracy_score(y_true = y2_train, y_pred = y_pred_train),"\t",
      metrics.accuracy_score(y_true = y2_test, y_pred = y_pred_test))
display(HTML(pd.DataFrame(metrics.confusion_matrix(y_true=y2_test, y_pred=y_pred_test)).to_html()))
0.01 	 50 	 0.780266920877 	 0.790694764602
0 1 2 3 4 5 6
0 220 0 101 1019 0 0 1464
1 5 0 0 77 0 0 27
2 35 0 9 1051 0 0 268
3 2226 0 117 115211 0 0 8972
4 30 0 4 3160 0 0 2667
5 173 0 215 2441 0 0 6113
6 1152 0 1025 11151 0 0 48864
In [98]:
abc = AdaBoostClassifier(random_state=41)
abc.fit(XTrain, y3_train)  
y_pred_train = abc.predict(XTrain)
y_pred_test = abc.predict(XTest)
print(train_size,"\t",pca_i, "\t",
      metrics.accuracy_score(y_true = y3_train, y_pred = y_pred_train),"\t",
      metrics.accuracy_score(y_true = y3_test, y_pred = y_pred_test))
display(HTML(pd.DataFrame(metrics.confusion_matrix(y_true=y3_test, y_pred=y_pred_test)).to_html()))
0.01 	 50 	 0.491420400381 	 0.500733889325
0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 167 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 6866 0 0 0
2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 0 0 0
3 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 307 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 6889 0 0 0
4 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1937 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2303 0 0 0
5 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0
6 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 72 0 0 0
7 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 40 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 148 0 0 0
8 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 31 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 308 0 0 0
9 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 229 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 7772 0 0 0
10 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 8 0 0 0
11 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 43 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 65 0 0 0
12 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 9 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 11 0 0 0
13 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 7 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 8 0 0 0
14 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 33 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 32 0 0 0
15 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 15884 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 12056 0 0 0
16 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 20 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 86 0 0 0
17 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 941 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2872 0 0 0
18 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 319 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 552 0 0 0
19 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 81 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 156 0 0 0
20 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 5 0 0 0
21 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 118 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 644 0 0 0
22 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 11695 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 15151 0 0 0
23 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0
24 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 4 0 0 0
25 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 10 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 42 0 0 0
26 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0
27 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 6 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0
28 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 16 0 0 0
29 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 92 0 0 0
30 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 12 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 79 0 0 0
31 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
32 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 7 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 21 0 0 0
33 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 174 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3006 0 0 0
34 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 20 0 0 0
35 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 4766 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2545 0 0 0
36 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 42 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 124 0 0 0
37 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 8 0 0 0
38 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0
39 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 28 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 63 0 0 0
40 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 779 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2439 0 0 0
41 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 0 0 0
42 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 622 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 11035 0 0 0
43 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 9 0 0 0
44 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 27 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 77 0 0 0
45 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 5468 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 88167 0 0 0
46 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0
47 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 4 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 15 0 0 0
48 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 182 0 0 0

DecisionTree

In [99]:
for max_depth in range(5,20):
    decisionTree = DecisionTreeClassifier(max_depth=max_depth)
    decisionTree.fit(XTrain, y1_train)  
    y_pred_train = decisionTree.predict(XTrain)
    y_pred_test = decisionTree.predict(XTest)
    print(max_depth,"\t",train_size,"\t",pca_i, "\t",
          metrics.accuracy_score(y_true = y1_train, y_pred = y_pred_train),"\t",
          metrics.accuracy_score(y_true = y1_test, y_pred = y_pred_test))
display(HTML(pd.DataFrame(metrics.confusion_matrix(y_true=y1_test, y_pred=y_pred_test)).to_html()))
5 	 0.01 	 50 	 0.517159199237 	 0.493212125295
6 	 0.01 	 50 	 0.561963775024 	 0.511455892048
7 	 0.01 	 50 	 0.60962821735 	 0.512692676025
8 	 0.01 	 50 	 0.661582459485 	 0.509997738177
9 	 0.01 	 50 	 0.725929456625 	 0.488082118606
10 	 0.01 	 50 	 0.787893231649 	 0.489684644148
11 	 0.01 	 50 	 0.84795042898 	 0.488544107952
12 	 0.01 	 50 	 0.88894184938 	 0.476450574359
13 	 0.01 	 50 	 0.922306959009 	 0.480093552842
14 	 0.01 	 50 	 0.949475691134 	 0.475743153174
15 	 0.01 	 50 	 0.968541468065 	 0.477345678715
16 	 0.01 	 50 	 0.978551000953 	 0.46985278902
17 	 0.01 	 50 	 0.986653956149 	 0.469357112952
18 	 0.01 	 50 	 0.989990467112 	 0.465593824742
19 	 0.01 	 50 	 0.992373689228 	 0.465199208843
0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20
0 9 35 9 2 0 0 1 7 46 63 28 66 78 12 147 24 3 1 17 5 45
1 52 1161 162 1 11 0 11 88 371 169 128 583 430 16 1104 415 145 5 73 85 463
2 62 396 218 5 24 0 13 77 311 49 106 412 356 9 813 370 190 2 45 53 277
3 3 7 4 0 0 0 0 3 13 11 7 18 35 1 39 10 0 1 5 2 9
4 2 14 8 0 0 0 1 5 35 9 15 24 41 3 88 34 14 1 7 2 34
5 0 2 0 0 0 0 0 0 3 4 3 6 11 0 14 4 0 0 4 0 4
6 4 12 8 2 0 0 3 13 59 35 22 52 50 1 184 30 14 3 15 5 25
7 66 105 128 2 5 0 10 403 405 22 44 157 306 11 1555 509 350 5 55 71 442
8 61 412 292 11 34 0 9 399 1741 184 181 658 1764 30 3777 1508 301 8 228 195 985
9 21 166 62 3 5 0 3 18 249 9006 132 258 261 10 774 169 40 1 142 25 200
10 30 117 47 6 5 0 6 56 259 140 159 149 480 21 611 177 33 8 57 50 191
11 67 197 154 9 6 0 6 87 355 284 295 2569 524 65 1457 343 66 3 135 121 360
12 154 330 250 39 92 0 7 340 1376 299 469 673 15437 66 5184 1439 208 108 297 323 899
13 9 25 6 0 0 0 0 15 50 82 24 78 26 11 176 24 0 0 27 5 32
14 260 1013 745 40 50 0 55 1503 3996 785 943 2082 4694 212 34787 5196 1903 24 668 651 3480
15 68 290 328 8 20 0 25 478 1350 65 309 661 1698 17 5822 3701 753 0 305 153 1558
16 92 154 251 0 7 0 20 336 386 9 104 233 366 0 2182 715 24368 0 62 98 403
17 0 4 0 1 0 0 0 2 7 0 1 5 11 1 92 4 1 0 1 0 3
18 13 129 101 14 2 0 11 132 618 265 118 316 820 40 1454 483 70 28 452 41 320
19 18 73 52 11 5 0 14 56 228 97 104 216 268 32 808 159 70 3 45 66 108
20 137 390 297 1 7 0 26 266 699 149 153 775 634 59 2870 1338 392 0 168 170 2576
In [100]:
for max_depth in range(5,20):
    decisionTree = DecisionTreeClassifier(max_depth=max_depth)
    decisionTree.fit(XTrain, y2_train)  
    y_pred_train = decisionTree.predict(XTrain)
    y_pred_test = decisionTree.predict(XTest)
    print(max_depth,"\t",train_size,"\t",pca_i, "\t",
          metrics.accuracy_score(y_true = y2_train, y_pred = y_pred_train),"\t",
          metrics.accuracy_score(y_true = y2_test, y_pred = y_pred_test))
display(HTML(pd.DataFrame(metrics.confusion_matrix(y_true=y2_test, y_pred=y_pred_test)).to_html()))
5 	 0.01 	 50 	 0.879408960915 	 0.835748350554
6 	 0.01 	 50 	 0.902764537655 	 0.833294032156
7 	 0.01 	 50 	 0.923736892278 	 0.828154400689
8 	 0.01 	 50 	 0.940419447092 	 0.820709634884
9 	 0.01 	 50 	 0.950428979981 	 0.817085905956
10 	 0.01 	 50 	 0.959485224023 	 0.813572861976
11 	 0.01 	 50 	 0.967588179218 	 0.813452552251
12 	 0.01 	 50 	 0.97664442326 	 0.799766117894
13 	 0.01 	 50 	 0.983317445186 	 0.800213670072
14 	 0.01 	 50 	 0.987607244995 	 0.797018243767
15 	 0.01 	 50 	 0.992373689228 	 0.794818981987
16 	 0.01 	 50 	 0.995233555767 	 0.797744914508
17 	 0.01 	 50 	 0.996663489037 	 0.794972978436
18 	 0.01 	 50 	 0.998093422307 	 0.797966284403
19 	 0.01 	 50 	 0.99857006673 	 0.796036516408
0 1 2 3 4 5 6
0 195 0 68 798 68 376 1299
1 0 0 4 54 3 14 34
2 23 0 54 842 41 76 327
3 499 2 1215 115995 1392 1527 5896
4 75 0 56 2679 483 455 2113
5 172 0 78 1747 268 1102 5575
6 779 0 607 7587 1579 4055 47585
In [101]:
for max_depth in range(5,20):
    decisionTree = DecisionTreeClassifier(max_depth=max_depth)
    decisionTree.fit(XTrain, y3_train)  
    y_pred_train = decisionTree.predict(XTrain)
    y_pred_test = decisionTree.predict(XTest)
    print(max_depth,"\t",train_size,"\t",pca_i, "\t",
          metrics.accuracy_score(y_true = y3_train, y_pred = y_pred_train),"\t",
          metrics.accuracy_score(y_true = y3_test, y_pred = y_pred_test))
display(HTML(pd.DataFrame(metrics.confusion_matrix(y_true=y3_test, y_pred=y_pred_test)).to_html()))
5 	 0.01 	 50 	 0.60819828408 	 0.563968680972
6 	 0.01 	 50 	 0.630600571973 	 0.553039745521
7 	 0.01 	 50 	 0.683984747378 	 0.539593930615
8 	 0.01 	 50 	 0.720209723546 	 0.52785170142
9 	 0.01 	 50 	 0.772640610105 	 0.524863207842
10 	 0.01 	 50 	 0.814585319352 	 0.516417465122
11 	 0.01 	 50 	 0.853670162059 	 0.503799381127
12 	 0.01 	 50 	 0.878455672069 	 0.493216937684
13 	 0.01 	 50 	 0.902764537655 	 0.486364095728
14 	 0.01 	 50 	 0.922783603432 	 0.482134005784
15 	 0.01 	 50 	 0.944232602479 	 0.476219579686
16 	 0.01 	 50 	 0.961868446139 	 0.476898126537
17 	 0.01 	 50 	 0.973784556721 	 0.471522688008
18 	 0.01 	 50 	 0.979980934223 	 0.468332074092
19 	 0.01 	 50 	 0.986653956149 	 0.468067392696
0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48
0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
1 0 1405 0 336 83 0 0 2 4 1043 0 9 0 0 7 251 0 91 9 3 2 26 547 0 0 0 0 0 0 26 0 0 0 23 0 84 0 0 0 0 86 0 1200 3 0 1793 0 0 0
2 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 4 0 0 0
3 0 261 0 501 143 0 0 1 11 376 0 6 0 0 14 468 0 199 25 3 3 13 948 0 0 0 0 0 0 20 1 0 0 81 0 110 0 0 0 0 118 0 335 3 1 3555 0 0 0
4 0 73 0 115 331 0 0 6 9 98 0 1 0 0 42 1199 0 118 10 4 0 26 360 0 0 0 0 0 0 13 4 0 0 27 0 240 0 0 0 0 58 0 105 2 0 1399 0 0 0
5 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0
6 0 14 0 8 0 0 0 0 0 7 0 0 0 0 0 3 0 1 0 0 0 0 9 0 0 0 0 0 0 0 0 0 0 0 0 3 0 0 0 0 0 0 11 0 0 18 0 0 0
7 0 6 0 13 8 0 0 0 0 2 0 0 0 0 0 8 0 4 0 0 0 6 27 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 0 17 0 10 0 0 85 0 0 0
8 0 6 0 12 11 0 0 0 2 11 0 0 0 0 1 13 0 7 1 1 0 4 43 0 0 0 0 0 0 2 2 0 0 3 0 3 0 0 0 0 7 0 8 0 0 202 0 0 0
9 0 1532 0 412 110 0 0 0 7 1311 0 3 0 0 10 299 0 138 12 1 0 26 569 0 0 0 0 0 0 25 0 0 0 56 0 86 0 0 0 0 132 0 1227 6 0 2038 0 0 1
10 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 3 0 0 0
11 0 1 0 4 13 0 0 0 0 1 0 0 0 0 0 4 0 1 0 1 0 1 22 0 0 0 0 0 0 2 0 0 0 0 0 1 0 0 0 0 4 0 6 0 0 47 0 0 0
12 0 2 0 2 1 0 0 0 0 1 0 0 0 0 0 1 0 0 1 0 0 0 3 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 7 0 0 0
13 0 1 0 0 2 0 0 0 0 2 0 0 0 0 0 1 0 0 0 0 0 1 4 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 3 0 0 0
14 0 1 0 1 3 0 0 0 0 0 0 0 0 0 2 10 0 2 1 0 0 0 5 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 2 0 5 0 0 32 0 0 0
15 0 341 0 725 1799 0 0 14 30 527 0 9 0 0 84 13450 0 336 34 24 3 106 1655 0 0 0 0 0 0 29 7 0 0 114 0 2754 0 0 0 0 298 0 451 2 9 5138 0 0 1
16 0 2 0 13 4 0 0 0 2 2 0 0 0 0 0 7 0 5 0 0 0 5 20 0 0 0 0 0 0 1 0 0 0 1 0 1 0 0 0 0 6 0 4 0 0 33 0 0 0
17 0 64 0 160 94 0 0 17 18 109 0 1 0 0 11 442 0 320 10 6 13 15 393 0 0 0 0 0 0 30 23 0 0 41 0 58 0 0 0 0 82 0 153 0 0 1752 0 0 1
18 0 31 0 38 54 0 0 0 8 33 0 0 0 0 2 56 0 22 11 1 2 14 162 0 0 0 0 0 0 1 2 0 0 9 0 11 0 0 0 0 25 0 32 0 0 357 0 0 0
19 0 6 0 6 12 0 0 0 0 9 0 0 0 0 3 18 0 10 0 0 0 2 39 0 0 0 0 0 0 0 1 0 0 4 0 5 0 0 0 0 4 0 10 0 0 108 0 0 0
20 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 3 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 0 0 0 0 0 0 0 0
21 0 30 0 33 17 0 0 2 2 26 0 1 0 0 6 45 0 15 3 2 0 42 118 0 0 0 0 0 0 0 0 0 0 7 0 11 0 0 0 0 51 0 34 0 0 317 0 0 0
22 0 358 0 969 588 0 0 32 58 545 0 11 0 0 82 1390 0 343 51 43 4 140 12889 0 0 0 0 0 0 43 30 0 0 157 0 348 0 0 0 0 576 0 581 9 1 7593 0 0 5
23 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0
24 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 3 0 0 0
25 0 1 0 4 3 0 0 0 0 1 0 0 0 0 2 1 0 2 0 0 0 1 9 0 0 0 0 0 0 0 1 0 0 0 0 2 0 0 0 0 1 0 1 0 0 23 0 0 0
26 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 1 0 0 0
27 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0
28 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 2 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 12 0 0 0
29 0 13 0 1 0 0 0 0 0 12 0 0 0 0 0 3 0 1 3 0 0 1 7 0 0 0 0 0 0 1 0 0 0 1 0 2 0 0 0 0 1 0 15 0 0 32 0 0 0
30 0 3 0 6 4 0 0 0 1 1 0 0 0 0 0 6 0 6 0 0 0 1 12 0 0 0 0 0 0 0 1 0 0 1 0 5 0 0 0 0 8 0 4 0 0 32 0 0 0
31 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
32 0 0 0 1 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 3 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 0 0 0 0 17 0 0 0
33 0 97 0 241 76 0 0 0 2 155 0 4 0 0 3 185 0 95 13 0 1 6 362 0 0 0 0 0 0 16 3 0 0 42 0 42 0 0 0 0 31 0 106 1 2 1697 0 0 0
34 0 5 0 1 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 4 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 3 0 0 5 0 0 0
35 0 63 0 128 413 0 0 0 9 124 0 7 0 0 12 3461 0 67 4 1 0 18 252 0 0 0 0 0 0 1 0 0 0 18 0 1558 0 0 0 0 42 0 84 0 0 1048 0 0 1
36 0 6 0 5 10 0 0 1 0 8 0 0 0 0 0 11 0 4 1 1 0 4 28 0 0 0 0 0 0 0 0 0 0 3 0 0 0 0 0 0 6 0 11 0 0 67 0 0 0
37 0 1 0 1 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 4 0 0 0
38 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0
39 0 3 0 1 5 0 0 0 0 1 0 0 0 0 0 8 0 3 3 0 1 1 9 0 0 0 0 0 0 0 0 0 0 3 0 3 0 0 0 0 4 0 1 0 0 45 0 0 0
40 0 130 0 200 112 0 0 12 7 179 0 1 0 0 33 302 0 90 22 10 2 62 502 0 0 0 0 0 0 4 4 0 0 44 0 66 0 0 0 0 112 0 194 3 0 1127 0 0 0
41 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0
42 0 1895 0 571 186 0 0 8 11 1457 0 5 0 0 16 414 0 156 37 5 0 66 1014 0 0 0 0 0 0 46 0 0 0 94 0 150 0 0 0 0 202 0 1983 6 0 3334 0 0 1
43 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 4 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 0 0 2 0 0 0
44 0 0 0 3 2 0 0 0 0 2 0 0 0 0 1 3 0 2 1 0 0 6 21 0 0 0 0 0 0 0 0 0 0 3 0 1 0 0 0 0 11 0 4 0 0 44 0 0 0
45 0 1825 0 3311 2217 0 0 24 165 2547 0 44 0 0 82 3915 0 1657 360 36 29 247 7812 0 0 0 0 0 0 131 57 0 0 655 0 1466 0 0 0 0 1285 0 2410 10 19 63302 0 0 29
46 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0
47 0 0 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 1 0 3 0 0 10 0 0 0
48 0 46 0 3 1 0 0 0 0 29 0 0 0 0 0 10 0 3 1 0 0 1 9 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 1 0 33 0 0 46 0 0 0

RandomForestClassifier

In [102]:
rfc = RandomForestClassifier( random_state=41,n_jobs=-1)
rfc.fit(XTrain, y1_train)  
y_pred_train = rfc.predict(XTrain)
y_pred_test = rfc.predict(XTest)
print(train_size,"\t",pca_i, "\t",
      metrics.accuracy_score(y_true = y1_train, y_pred = y_pred_train),"\t",
      metrics.accuracy_score(y_true = y1_test, y_pred = y_pred_test))
display(HTML(pd.DataFrame(metrics.confusion_matrix(y_true=y1_test, y_pred=y_pred_test)).to_html()))
0.01 	 50 	 0.992850333651 	 0.551932896048
0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20
0 0 21 16 0 0 0 1 6 39 101 3 46 102 0 201 23 15 0 0 1 23
1 3 1445 148 0 6 0 5 24 404 351 14 198 574 0 1881 146 128 0 11 3 132
2 5 178 631 0 1 0 4 35 242 151 18 241 404 1 1272 159 354 0 7 1 84
3 0 6 0 0 0 0 0 1 5 23 1 9 55 0 62 2 2 0 0 0 2
4 0 6 7 0 0 0 0 2 33 14 0 16 52 1 168 13 13 0 2 0 10
5 0 3 2 0 0 0 0 1 5 14 0 3 9 1 13 1 3 0 0 0 0
6 0 17 9 0 0 0 7 11 38 87 1 28 90 1 206 17 16 0 0 0 9
7 0 65 27 0 2 0 0 392 375 56 7 74 594 2 2482 112 313 0 5 3 142
8 4 255 82 1 3 0 4 132 1867 147 12 219 2285 0 6651 577 215 0 45 1 278
9 2 66 43 0 2 0 6 13 152 9714 10 82 315 0 925 46 127 0 1 0 41
10 2 69 56 0 2 0 3 14 245 258 39 47 713 1 969 44 93 0 5 2 40
11 2 122 181 0 10 0 3 34 286 368 16 2597 940 7 1967 238 139 0 16 2 175
12 1 120 62 0 7 0 2 84 591 316 39 280 19442 2 6476 254 178 0 13 3 120
13 0 19 8 0 0 0 3 4 46 88 3 65 56 7 242 25 9 0 4 0 11
14 5 346 209 8 5 0 10 261 1796 924 66 638 6506 6 49227 1250 1258 0 43 10 519
15 2 156 84 1 1 0 4 83 619 106 22 313 1941 2 10431 2730 567 0 50 3 494
16 0 67 68 0 0 0 2 84 243 111 15 63 662 0 3515 230 24665 0 11 0 50
17 0 1 0 0 0 0 0 0 2 2 0 0 48 0 60 17 1 0 0 0 2
18 2 66 24 0 2 0 4 67 488 216 17 187 975 3 2475 556 68 0 135 1 141
19 1 43 68 0 0 0 3 23 194 191 6 111 402 1 1189 46 119 0 13 2 21
20 3 197 103 0 4 0 7 75 520 270 22 392 736 9 5677 889 383 0 30 0 1790
In [103]:
rfc = RandomForestClassifier( random_state=41,n_jobs=-1)
rfc.fit(XTrain, y2_train)  
y_pred_train = rfc.predict(XTrain)
y_pred_test = rfc.predict(XTest)
print(train_size,"\t",pca_i, "\t",
      metrics.accuracy_score(y_true = y2_train, y_pred = y_pred_train),"\t",
      metrics.accuracy_score(y_true = y2_test, y_pred = y_pred_test))
display(HTML(pd.DataFrame(metrics.confusion_matrix(y_true=y2_test, y_pred=y_pred_test)).to_html()))
0.01 	 50 	 0.988560533842 	 0.832976414481
0 1 2 3 4 5 6
0 25 0 1 1203 1 43 1531
1 0 0 0 82 1 1 25
2 0 0 0 1126 1 3 233
3 7 0 1 123325 13 23 3157
4 1 0 0 3753 15 19 2073
5 5 0 0 2957 3 112 5865
6 25 0 0 12281 29 244 49613
In [104]:
rfc = RandomForestClassifier( random_state=41,n_jobs=-1)
rfc.fit(XTrain, y3_train)  
y_pred_train = rfc.predict(XTrain)
y_pred_test = rfc.predict(XTest)
print(train_size,"\t",pca_i, "\t",
      metrics.accuracy_score(y_true = y3_train, y_pred = y_pred_train),"\t",
      metrics.accuracy_score(y_true = y3_test, y_pred = y_pred_test))
display(HTML(pd.DataFrame(metrics.confusion_matrix(y_true=y3_test, y_pred=y_pred_test)).to_html()))
0.01 	 50 	 0.994280266921 	 0.580951601804
0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
1 0 894 0 179 8 0 0 0 0 1232 0 0 0 0 1 109 0 21 0 0 0 2 415 0 0 0 0 0 0 3 0 0 0 3 0 10 0 0 0 0 9 0 1023 0 0 3124 0 0 0
2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 0 0 0
3 0 125 0 319 8 0 0 0 0 195 0 0 0 0 0 202 0 36 0 0 0 0 521 0 0 0 0 0 0 0 0 0 0 5 0 12 0 0 0 0 11 0 190 0 0 5572 0 0 0
4 0 14 0 45 56 0 0 0 0 37 0 0 0 0 0 1314 0 17 0 0 0 1 365 0 0 0 0 0 0 0 0 0 0 0 0 71 0 0 0 0 0 0 28 0 0 2292 0 0 0
5 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0
6 0 8 0 4 0 0 0 0 0 14 0 0 0 0 0 0 0 0 0 0 0 0 9 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 6 0 0 33 0 0 0
7 0 4 0 5 0 0 0 0 0 8 0 0 0 0 0 12 0 2 0 0 0 1 41 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 10 0 0 105 0 0 0
8 0 2 0 5 0 0 0 0 0 1 0 0 0 0 0 10 0 0 0 0 0 0 26 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 294 0 0 0
9 0 1020 0 220 12 0 0 0 0 1503 0 0 0 0 0 155 0 17 2 0 0 0 449 0 0 0 0 0 0 0 0 0 0 6 0 9 0 0 0 0 17 0 1058 0 0 3533 0 0 0
10 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 7 0 0 0
11 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 5 0 0 0 0 0 0 22 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 79 0 0 0
12 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 3 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 14 0 0 0
13 0 1 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 10 0 0 0
14 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 12 0 0 0 0 0 0 6 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 46 0 0 0
15 0 152 0 280 161 0 0 0 1 260 0 0 0 0 2 16466 0 71 0 0 0 4 1242 0 0 0 0 0 0 0 0 0 0 7 0 721 0 0 0 0 8 0 172 0 0 8393 0 0 0
16 0 0 0 1 1 0 0 0 0 3 0 0 0 0 0 6 0 3 0 0 0 0 20 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 1 0 2 0 0 67 0 0 0
17 0 26 0 86 15 0 0 0 0 59 0 0 0 0 0 215 0 112 1 0 0 0 318 0 0 0 0 0 0 0 0 0 0 3 0 12 0 0 0 0 3 0 44 0 0 2919 0 0 0
18 0 4 0 16 4 0 0 0 0 16 0 0 0 0 0 65 0 11 0 0 0 2 181 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 25 0 0 545 0 0 0
19 0 3 0 2 0 0 0 0 0 2 0 0 0 0 0 15 0 3 0 0 1 0 42 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 0 6 0 0 161 0 0 0
20 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 4 0 0 0
21 0 8 0 25 1 0 0 0 0 20 0 0 0 0 0 34 0 2 0 1 0 1 125 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 3 0 14 0 0 527 0 0 0
22 0 184 0 259 46 0 0 0 2 282 0 0 0 0 2 831 0 90 2 1 0 8 14061 0 0 0 0 0 0 0 0 0 0 6 0 40 0 0 0 0 18 0 245 0 0 10769 0 0 0
23 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0
24 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 5 0 0 0
25 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 1 0 0 0 0 5 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 43 0 0 0
26 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 0 0 0
27 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 5 0 0 0
28 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 4 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 10 0 0 0
29 0 8 0 2 0 0 0 0 0 8 0 0 0 0 0 1 0 0 0 0 0 0 4 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 15 0 0 55 0 0 0
30 0 3 0 1 0 0 0 0 0 6 0 0 0 0 0 5 0 0 0 0 0 1 20 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 5 0 0 50 0 0 0
31 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0
32 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 5 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 22 0 0 0
33 0 36 0 137 4 0 0 0 1 71 0 0 0 0 0 112 0 25 0 0 0 0 204 0 0 0 0 0 0 0 0 0 0 2 0 6 0 0 0 0 4 0 39 0 0 2539 0 0 0
34 0 5 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 6 0 0 8 0 0 0
35 0 31 0 37 28 0 0 0 0 58 0 0 0 0 0 5061 0 9 0 0 0 0 197 0 0 0 0 0 0 0 0 0 0 1 0 390 0 0 0 0 1 0 24 0 0 1474 0 0 0
36 0 6 0 2 1 0 0 0 0 9 0 0 0 0 0 10 0 2 0 0 1 0 33 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 7 0 0 94 0 0 0
37 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 3 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 5 0 0 0
38 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0
39 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 6 0 1 0 0 0 0 16 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 64 0 0 0
40 0 82 0 130 9 0 0 0 0 150 0 0 0 0 2 273 0 33 1 0 1 4 524 0 0 0 0 0 0 0 0 0 0 2 0 13 0 0 0 0 11 0 103 0 0 1880 0 0 0
41 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0
42 0 1132 0 261 16 0 0 0 1 1630 0 0 0 0 0 315 0 35 0 0 0 1 941 0 0 0 0 0 0 1 0 0 0 9 0 13 0 0 0 0 23 0 1591 0 0 5688 0 0 0
43 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 8 0 0 0
44 0 1 0 0 0 0 0 0 0 3 0 0 0 0 0 3 0 0 0 0 0 0 18 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 77 0 0 0
45 0 478 0 632 89 0 0 1 2 695 0 1 0 0 0 2023 0 197 2 1 3 3 3465 0 0 0 0 0 0 1 0 0 0 17 0 100 0 0 0 0 34 0 576 0 1 85314 0 0 0
46 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0
47 0 1 0 0 0 0 0 0 0 2 0 0 0 0 0 0 0 0 0 0 0 0 4 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 11 0 0 0
48 0 29 0 3 0 0 0 0 0 36 0 0 0 0 0 1 0 1 0 0 0 0 13 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 22 0 0 80 0 0 0

ExtraTreesClassifier

In [105]:
etc = ExtraTreesClassifier( random_state=41,n_jobs=-1)
etc.fit(XTrain, y1_train)  
y_pred_train = etc.predict(XTrain)
y_pred_test = etc.predict(XTest)
print(train_size,"\t",pca_i, "\t",
      metrics.accuracy_score(y_true = y1_train, y_pred = y_pred_train),"\t",
      metrics.accuracy_score(y_true = y1_test, y_pred = y_pred_test))
display(HTML(pd.DataFrame(metrics.confusion_matrix(y_true=y1_test, y_pred=y_pred_test)).to_html()))
0.01 	 50 	 1.0 	 0.518241360559
0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20
0 3 25 8 0 0 0 0 3 34 58 3 61 135 1 204 9 37 0 2 1 14
1 6 1486 74 0 1 0 0 28 300 165 24 260 678 2 2010 142 202 0 11 3 81
2 10 187 370 0 4 0 1 39 208 108 20 381 556 0 1281 109 447 0 4 1 62
3 0 4 1 0 1 0 0 1 3 15 5 13 71 0 41 0 10 0 3 0 0
4 0 9 6 0 0 0 0 1 23 15 3 14 68 0 150 17 21 0 3 0 7
5 0 1 2 0 0 0 0 0 3 10 2 4 17 0 13 0 3 0 0 0 0
6 0 17 6 0 0 0 0 7 19 65 5 44 106 4 214 13 25 0 4 2 6
7 0 68 46 2 0 0 1 516 325 70 12 83 667 0 2287 117 363 0 11 2 81
8 12 355 101 0 3 0 1 124 1770 104 26 303 2412 1 6517 467 335 0 31 4 212
9 10 74 10 1 2 0 0 20 80 9423 17 106 356 0 1147 27 231 0 6 1 34
10 11 76 16 2 0 0 0 28 193 187 26 119 719 1 974 46 173 0 6 5 20
11 9 176 152 0 1 0 0 33 171 264 15 3077 1027 2 1629 145 245 0 27 1 129
12 18 166 77 0 3 0 0 111 698 379 57 400 18171 2 7027 358 397 0 12 14 100
13 2 25 1 1 0 0 0 8 25 70 2 64 98 12 223 20 29 0 4 0 6
14 24 522 173 5 5 0 0 408 1958 719 77 889 7303 10 47586 1185 1652 0 64 10 497
15 5 187 132 1 4 0 1 119 780 89 38 447 2653 4 9872 2063 757 0 27 1 429
16 1 162 84 0 1 0 1 153 292 235 18 136 1216 1 5364 273 21755 0 8 0 86
17 0 2 0 0 0 0 0 1 2 0 0 4 16 0 103 2 0 0 0 0 3
18 5 107 38 1 0 0 0 79 399 163 16 227 1014 2 2513 298 212 0 265 1 87
19 1 57 15 0 0 0 0 28 114 138 11 153 533 0 1154 35 150 0 4 15 25
20 11 290 85 0 0 0 0 113 551 180 20 555 1224 2 5729 637 524 0 34 1 1151
In [106]:
etc = ExtraTreesClassifier( random_state=41,n_jobs=-1)
etc.fit(XTrain, y2_train)  
y_pred_train = etc.predict(XTrain)
y_pred_test = etc.predict(XTest)
print(train_size,"\t",pca_i, "\t",
      metrics.accuracy_score(y_true = y2_train, y_pred = y_pred_train),"\t",
      metrics.accuracy_score(y_true = y2_test, y_pred = y_pred_test))
display(HTML(pd.DataFrame(metrics.confusion_matrix(y_true=y2_test, y_pred=y_pred_test)).to_html()))
0.01 	 50 	 1.0 	 0.814164785825
0 1 2 3 4 5 6
0 30 0 1 1506 0 21 1246
1 0 0 0 86 0 0 23
2 2 0 0 1184 0 1 176
3 4 0 0 123772 2 25 2723
4 1 0 0 3993 12 12 1843
5 2 0 0 3640 6 94 5200
6 22 0 2 16650 31 214 45273
In [107]:
etc = ExtraTreesClassifier( random_state=41,n_jobs=-1)
etc.fit(XTrain, y3_train)  
y_pred_train = etc.predict(XTrain)
y_pred_test = etc.predict(XTest)
print(train_size,"\t",pca_i, "\t",
      metrics.accuracy_score(y_true = y3_train, y_pred = y_pred_train),"\t",
      metrics.accuracy_score(y_true = y3_test, y_pred = y_pred_test))
display(HTML(pd.DataFrame(metrics.confusion_matrix(y_true=y3_test, y_pred=y_pred_test)).to_html()))
0.01 	 50 	 1.0 	 0.561620235133
0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
1 0 715 0 162 7 0 0 0 0 779 0 0 0 0 0 233 0 10 0 0 0 0 481 0 0 0 0 0 0 0 0 0 0 6 0 8 0 0 0 0 23 0 832 0 0 3777 0 0 0
2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 5 0 0 0
3 0 117 0 272 18 0 0 0 0 151 0 0 0 0 0 283 0 17 0 0 0 0 512 0 0 0 0 0 0 0 0 0 0 16 0 11 0 0 0 0 16 0 153 0 0 5630 0 0 0
4 0 13 0 71 46 0 0 0 1 35 0 0 0 0 5 1006 0 8 1 0 0 0 334 0 0 0 0 0 0 0 0 0 0 5 0 86 0 0 0 0 2 0 32 0 0 2595 0 0 0
5 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0
6 0 10 0 1 0 0 0 0 0 6 0 0 0 0 0 0 0 0 0 0 0 0 4 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 8 0 0 45 0 0 0
7 0 2 0 1 0 0 0 0 0 3 0 0 0 0 0 9 0 1 0 0 0 0 22 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 149 0 0 0
8 0 2 0 2 0 0 0 0 0 1 0 0 0 0 0 11 0 0 0 0 0 0 25 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 296 0 0 0
9 0 799 0 186 8 0 0 0 0 977 0 0 0 0 1 270 0 20 0 0 0 3 508 0 0 0 0 0 0 1 0 0 0 6 0 16 0 0 0 0 25 0 935 0 0 4246 0 0 0
10 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 6 0 0 0
11 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 14 0 1 0 0 0 0 12 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 2 0 0 77 0 0 0
12 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 6 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 10 0 0 0
13 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 1 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 10 0 0 0
14 0 0 0 2 1 0 0 0 0 0 0 0 0 0 0 4 0 0 0 0 0 0 4 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 54 0 0 0
15 0 152 0 334 160 0 0 0 1 195 0 1 0 0 6 14201 0 36 0 0 0 3 1372 0 0 0 0 0 0 0 0 0 0 8 0 1019 0 0 0 0 14 0 179 0 0 10259 0 0 0
16 0 1 0 0 0 0 0 0 0 5 0 0 0 0 0 3 0 0 0 0 0 0 19 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 77 0 0 0
17 0 20 0 97 17 0 0 0 0 39 0 0 0 0 2 244 0 195 0 0 0 0 246 0 0 0 0 0 0 0 0 0 0 7 0 10 0 0 0 0 3 0 55 0 0 2878 0 0 0
18 0 14 0 16 2 0 0 0 0 13 0 0 0 0 0 78 0 0 0 0 0 1 144 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 10 0 0 591 0 0 0
19 0 2 0 4 1 0 0 0 0 1 0 0 0 0 0 15 0 1 0 0 0 0 25 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 4 0 0 184 0 0 0
20 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 7 0 0 0
21 0 22 0 8 0 0 0 0 0 11 0 0 0 0 0 44 0 3 0 0 0 1 95 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 1 0 11 0 0 565 0 0 0
22 0 134 0 263 27 0 0 0 0 229 0 0 0 0 0 978 0 40 2 0 0 2 12944 0 0 0 0 0 0 0 0 0 0 6 0 60 0 0 0 0 20 0 215 0 0 11926 0 0 0
23 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0
24 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 4 0 0 0
25 0 1 0 2 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 5 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 42 0 0 0
26 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 0 0 0
27 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 0 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 0 0 0
28 0 0 0 2 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 11 0 0 0
29 0 5 0 1 1 0 0 0 0 10 0 0 0 0 0 2 0 0 0 0 0 0 5 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 11 0 0 58 0 0 0
30 0 2 0 2 0 0 0 0 0 1 0 0 0 0 0 5 0 0 0 0 0 0 11 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 0 0 67 0 0 0
31 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0
32 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 4 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 22 0 0 0
33 0 32 0 123 9 0 0 0 0 41 0 0 0 0 0 115 0 11 0 0 0 2 194 0 0 0 0 0 0 0 0 0 0 2 0 5 0 0 0 0 5 0 60 0 0 2581 0 0 0
34 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 4 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 0 0 11 0 0 0
35 0 25 0 69 31 0 0 0 0 36 0 0 0 0 0 4332 0 4 0 0 0 1 292 0 0 0 0 0 0 0 0 0 0 0 0 443 0 0 0 0 0 0 31 0 0 2047 0 0 0
36 0 1 0 3 1 0 0 0 0 4 0 0 0 0 0 12 0 0 0 0 0 0 25 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 3 0 0 116 0 0 0
37 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 3 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 6 0 0 0
38 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0
39 0 1 0 3 0 0 0 0 0 1 0 0 0 0 0 4 0 1 0 0 0 0 7 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 2 0 0 71 0 0 0
40 0 69 0 94 12 0 0 0 0 76 0 0 0 0 0 293 0 12 0 0 0 0 454 0 0 0 0 0 0 0 0 0 0 4 0 22 0 0 0 0 13 0 73 0 0 2096 0 0 0
41 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 0 0 0
42 0 918 0 300 27 0 0 1 0 1076 0 0 0 0 1 452 0 26 1 0 0 1 909 0 0 0 0 0 0 4 0 0 0 7 0 23 0 0 0 0 50 0 1432 0 0 6429 0 0 0
43 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 10 0 0 0
44 0 1 0 1 0 0 0 0 0 2 0 0 0 0 0 3 0 1 0 0 0 0 14 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 81 0 0 0
45 0 396 0 657 90 0 0 0 2 535 0 0 0 0 3 2308 0 133 1 1 0 7 3213 0 0 0 0 0 0 3 0 0 0 27 0 153 0 0 0 0 62 0 582 0 0 85462 0 0 0
46 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
47 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 4 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 12 0 0 0
48 0 17 0 3 0 0 0 0 0 28 0 0 0 0 0 8 0 0 0 0 0 0 9 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 17 0 0 103 0 0 0

GradientBoostingClassifier

In [108]:
gbc = GradientBoostingClassifier()
gbc.fit(XTrain, y1_train)  
y_pred_train = gbc.predict(XTrain)
y_pred_test = gbc.predict(XTest)
print(train_size,"\t",pca_i, "\t",
      metrics.accuracy_score(y_true = y1_train, y_pred = y_pred_train),"\t",
      metrics.accuracy_score(y_true = y1_test, y_pred = y_pred_test))
display(HTML(pd.DataFrame(metrics.confusion_matrix(y_true=y1_test, y_pred=y_pred_test)).to_html()))
0.01 	 50 	 0.999523355577 	 0.636501008195
0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20
0 1 9 6 0 0 0 0 1 24 47 4 32 125 0 248 14 5 0 12 0 70
1 0 1403 36 0 0 0 0 7 331 90 28 235 582 1 2391 119 50 0 20 1 179
2 0 27 687 0 0 0 0 4 164 28 10 396 573 1 1498 109 196 0 2 1 92
3 0 1 0 0 0 0 0 1 0 5 1 4 70 0 76 5 0 0 1 0 4
4 0 1 6 0 0 0 0 0 26 5 0 9 61 0 199 9 7 0 0 0 14
5 0 3 0 0 0 0 0 0 1 7 0 6 14 0 23 1 0 0 0 0 0
6 0 3 4 0 0 0 1 7 12 45 6 31 84 0 279 29 7 0 17 0 12
7 0 15 6 0 0 0 0 726 140 3 5 25 354 2 2748 90 363 0 7 0 167
8 0 47 50 0 0 0 0 19 2495 36 6 56 2123 0 6869 630 174 0 24 1 248
9 0 5 2 0 0 0 0 6 31 9316 15 31 319 3 1717 46 23 0 5 3 23
10 0 8 9 0 0 0 0 2 272 88 30 25 888 0 1159 41 24 0 5 1 50
11 0 16 86 0 0 0 0 3 24 135 5 4043 674 3 1734 151 8 0 16 1 204
12 0 10 15 0 0 0 0 9 80 27 13 50 22394 5 5021 219 67 0 8 9 63
13 0 3 0 0 0 0 0 0 9 80 6 63 33 1 340 21 1 0 3 0 30
14 2 77 50 0 0 0 0 102 719 306 42 214 4635 17 53862 1110 1360 0 62 36 493
15 0 27 21 0 0 0 0 11 155 12 6 61 1288 1 9177 5779 633 0 41 3 394
16 0 10 9 0 0 0 0 11 79 0 13 4 205 3 2611 231 26560 0 10 1 39
17 0 1 0 0 0 0 0 0 3 0 0 0 11 0 114 1 1 0 0 0 2
18 1 11 4 0 0 0 0 11 246 90 9 61 769 0 2653 686 23 0 665 1 197
19 0 7 15 0 0 0 0 3 44 71 13 82 425 2 1619 25 68 0 2 46 11
20 0 106 23 0 0 0 0 47 129 48 6 198 305 5 5063 657 251 0 13 2 4254
In [109]:
gbc = GradientBoostingClassifier()
gbc.fit(XTrain, y2_train)  
y_pred_train = gbc.predict(XTrain)
y_pred_test = gbc.predict(XTest)
print(train_size,"\t",pca_i, "\t",
      metrics.accuracy_score(y_true = y2_train, y_pred = y_pred_train),"\t",
      metrics.accuracy_score(y_true = y2_test, y_pred = y_pred_test))
display(HTML(pd.DataFrame(metrics.confusion_matrix(y_true=y2_test, y_pred=y_pred_test)).to_html()))
0.01 	 50 	 0.99714013346 	 0.870436050569
0 1 2 3 4 5 6
0 116 0 17 724 0 117 1830
1 6 0 0 60 0 2 41
2 7 0 2 993 2 18 341
3 37 0 37 123054 60 93 3245
4 2 0 5 2879 112 38 2825
5 37 0 13 1600 18 331 6943
6 73 0 40 4577 23 220 57259
In [110]:
gbc = GradientBoostingClassifier()
gbc.fit(XTrain, y3_train)  
y_pred_train = gbc.predict(XTrain)
y_pred_test = gbc.predict(XTest)
print(train_size,"\t",pca_i, "\t",
      metrics.accuracy_score(y_true = y3_train, y_pred = y_pred_train),"\t",
      metrics.accuracy_score(y_true = y3_test, y_pred = y_pred_test))
display(HTML(pd.DataFrame(metrics.confusion_matrix(y_true=y3_test, y_pred=y_pred_test)).to_html()))
0.01 	 50 	 0.99714013346 	 0.624465223271
0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0
1 0 883 0 68 5 0 0 0 0 1282 0 0 0 0 0 31 0 2 0 0 0 3 334 0 0 0 0 0 0 0 0 0 0 9 0 4 0 0 0 0 23 0 1861 0 0 2528 0 0 0
2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 4 0 0 0
3 0 53 0 427 10 0 0 0 0 119 0 0 0 0 0 156 0 11 0 0 0 5 434 0 0 0 0 0 0 0 0 0 0 14 0 9 0 0 0 0 36 0 229 0 0 5693 0 0 0
4 0 4 0 34 45 0 0 0 0 6 0 0 0 0 0 1463 0 3 0 0 0 4 255 0 0 0 0 0 0 0 0 0 0 2 0 75 0 0 0 0 25 0 54 0 0 2270 0 0 0
5 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 4 0 0 0
6 0 7 0 0 0 0 0 0 0 10 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 24 0 0 31 0 0 0
7 0 3 0 1 1 0 0 0 0 3 0 0 0 0 0 1 0 0 1 0 0 0 40 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 0 0 135 0 0 0
8 0 1 0 4 0 0 0 0 0 0 0 0 0 0 0 9 0 2 0 0 0 1 12 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 2 0 2 0 0 304 0 0 0
9 0 893 0 83 4 0 0 0 0 1562 0 0 0 0 0 44 0 0 0 0 0 4 336 0 0 0 0 0 0 0 0 0 0 7 0 9 0 0 0 0 24 0 1892 0 0 3143 0 0 0
10 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 8 0 0 0
11 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 9 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 95 0 0 0
12 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 4 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 2 0 0 13 0 0 0
13 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 0 3 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 10 0 0 0
14 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 10 0 0 0 0 0 0 3 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 51 0 0 0
15 0 52 0 283 75 0 0 0 0 124 0 0 0 0 0 17179 0 8 0 2 0 23 766 0 0 0 0 0 0 1 0 0 0 18 0 1198 0 0 0 0 78 0 288 0 0 7845 0 0 0
16 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 0 16 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 3 0 0 83 0 0 0
17 0 10 0 63 5 0 0 0 0 23 0 0 0 0 0 156 0 190 0 0 0 3 103 0 0 0 0 0 0 0 0 0 0 7 0 8 0 0 0 0 11 0 67 0 0 3167 0 0 0
18 0 3 0 0 2 0 0 0 1 7 0 0 0 0 0 36 0 1 0 0 0 1 201 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 0 7 0 23 0 0 587 0 0 0
19 0 2 0 2 1 0 0 0 0 0 0 0 0 0 0 10 0 1 0 0 0 1 29 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 3 0 0 186 0 0 0
20 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 4 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 3 0 0 0
21 0 5 0 15 0 0 0 0 0 6 0 0 0 0 0 8 0 0 0 0 0 1 135 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 13 0 14 0 1 564 0 0 0
22 0 80 0 119 20 0 0 0 0 125 0 0 0 0 0 342 0 5 1 1 0 19 16606 0 0 0 0 0 0 0 0 0 0 16 0 36 0 0 0 0 65 0 344 0 0 9067 0 0 0
23 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0
24 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0
25 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 5 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 45 0 0 0
26 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 0 0 0
27 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 6 0 0 0
28 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 5 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 12 0 0 0
29 0 14 0 0 0 0 0 0 0 8 0 0 0 0 0 0 0 0 0 0 0 0 6 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 22 0 0 43 0 0 0
30 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 3 0 0 0 0 0 0 17 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 4 0 0 63 0 0 0
31 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0
32 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 4 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 23 0 0 0
33 0 11 0 186 7 0 0 0 0 43 0 0 0 0 0 65 0 7 0 0 0 3 154 0 0 0 0 0 0 0 0 0 0 5 0 3 0 0 0 0 11 0 67 0 0 2618 0 0 0
34 0 2 0 0 0 0 0 0 0 3 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 10 0 0 5 0 0 0
35 0 10 0 28 9 0 0 0 0 32 0 0 0 0 0 4867 0 2 0 0 0 4 77 0 0 0 0 0 0 0 0 0 0 3 0 967 0 0 0 0 15 0 16 0 0 1281 0 0 0
36 0 2 0 0 0 0 0 0 0 4 0 0 0 0 0 3 0 0 0 0 0 1 33 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 3 0 4 0 0 115 0 0 0
37 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 9 0 0 0
38 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0
39 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 3 0 1 0 0 0 0 7 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 77 0 0 0
40 0 27 0 99 5 0 0 0 0 70 0 0 0 0 0 155 0 3 0 0 0 5 567 0 0 0 0 0 0 0 0 0 0 9 0 23 0 0 0 0 33 0 171 0 0 2051 0 0 0
41 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0
42 0 917 0 80 9 0 0 0 0 1469 0 0 0 0 0 74 0 4 0 0 0 8 730 0 0 0 0 0 0 2 0 0 0 11 0 8 0 0 0 0 47 0 3306 0 0 4992 0 0 0
43 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 10 0 0 0
44 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 0 14 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 87 0 0 0
45 0 203 0 239 31 0 0 0 0 248 0 0 0 0 0 1500 0 86 0 0 0 78 1776 0 0 0 0 0 0 1 0 0 0 45 0 157 0 0 0 0 216 0 495 0 1 88558 0 0 1
46 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0
47 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 0 0 15 0 0 0
48 0 18 0 0 0 0 0 0 0 34 0 0 0 0 0 4 0 0 0 0 0 0 4 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 49 0 0 76 0 0 0

C-Support Vector Classification

In [111]:
svc = SVC(kernel='rbf',C=3)
svc.fit(XTrain, y1_train)  
y_pred_train = svc.predict(XTrain)
y_pred_test = svc.predict(XTest)
print(train_size,"\t",pca_i, "\t",
      metrics.accuracy_score(y_true = y1_train, y_pred = y_pred_train),"\t",
      metrics.accuracy_score(y_true = y1_test, y_pred = y_pred_test))
display(HTML(pd.DataFrame(metrics.confusion_matrix(y_true=y1_test, y_pred=y_pred_test)).to_html()))
0.01 	 50 	 0.989037178265 	 0.729202057778
0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20
0 6 20 7 0 0 0 0 1 35 8 16 92 158 0 135 7 2 0 12 1 98
1 4 3460 41 0 0 0 0 5 257 9 19 216 373 0 746 82 32 0 18 1 210
2 0 63 2424 0 0 0 0 7 92 9 7 297 200 0 441 78 94 0 2 2 72
3 4 16 0 0 0 0 0 1 0 1 4 8 88 0 28 5 0 0 3 0 10
4 2 8 18 0 0 0 0 1 41 2 3 25 82 0 101 18 7 0 0 0 29
5 0 5 0 0 0 0 0 0 3 4 1 10 20 0 11 1 0 0 0 0 0
6 0 21 4 0 0 0 5 5 12 15 3 49 71 14 278 27 7 0 12 0 14
7 0 38 29 0 0 0 0 2206 92 2 8 33 249 0 1580 132 107 0 8 1 166
8 0 194 96 0 0 0 0 49 5173 6 21 163 1975 0 3769 838 119 0 29 4 342
9 1 23 9 0 0 0 0 7 27 10483 7 78 156 0 660 33 34 0 1 0 26
10 0 32 21 0 0 0 0 0 301 31 484 39 754 0 808 56 11 0 1 2 62
11 3 18 69 0 0 0 0 0 8 12 2 5920 480 0 442 56 11 0 8 3 71
12 2 38 39 0 0 0 0 28 169 14 33 181 24644 2 2385 330 50 0 8 7 60
13 0 32 1 0 0 0 0 1 6 9 9 71 38 92 263 17 3 0 10 0 38
14 4 289 165 0 0 0 0 184 1562 195 54 604 5682 0 50927 1802 681 0 49 20 869
15 0 120 101 0 0 0 0 37 332 8 7 291 1270 0 5041 9489 311 0 49 0 553
16 0 33 73 0 0 0 0 46 123 0 6 21 198 0 1776 312 27136 0 2 1 59
17 0 2 0 0 0 0 0 0 4 0 0 1 11 0 111 4 0 0 0 0 0
18 0 47 19 0 0 0 0 33 210 22 7 118 558 1 1285 583 22 0 2370 0 152
19 0 23 11 0 0 0 0 8 43 25 45 180 509 1 1026 19 58 0 0 465 20
20 1 131 55 0 0 0 0 74 173 12 14 434 363 2 2871 617 111 0 7 0 6242
In [112]:
svc = SVC(kernel='rbf',C=3)
svc.fit(XTrain, y2_train)  
y_pred_train = svc.predict(XTrain)
y_pred_test = svc.predict(XTest)
print(train_size,"\t",pca_i, "\t",
      metrics.accuracy_score(y_true = y2_train, y_pred = y_pred_train),"\t",
      metrics.accuracy_score(y_true = y2_test, y_pred = y_pred_test))
display(HTML(pd.DataFrame(metrics.confusion_matrix(y_true=y2_test, y_pred=y_pred_test)).to_html()))
0.01 	 50 	 0.991420400381 	 0.886119626366
0 1 2 3 4 5 6
0 563 0 0 588 4 48 1601
1 2 0 0 54 0 2 51
2 25 0 0 987 4 15 332
3 53 0 0 123796 161 62 2454
4 11 0 0 3403 434 50 1963
5 46 0 1 1627 91 905 6272
6 228 0 0 3040 87 402 58435
In [113]:
svc = SVC(kernel='rbf',C=3)
svc.fit(XTrain, y3_train)  
y_pred_train = svc.predict(XTrain)
y_pred_test = svc.predict(XTest)
print(train_size,"\t",pca_i, "\t",
      metrics.accuracy_score(y_true = y3_train, y_pred = y_pred_train),"\t",
      metrics.accuracy_score(y_true = y3_test, y_pred = y_pred_test))
display(HTML(pd.DataFrame(metrics.confusion_matrix(y_true=y3_test, y_pred=y_pred_test)).to_html()))
0.01 	 50 	 0.981410867493 	 0.660861321386
0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0
1 0 1634 0 140 0 0 0 0 0 1564 0 0 0 0 0 27 0 2 0 0 0 0 154 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 17 0 2463 0 0 1031 0 0 0
2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 4 0 0 0
3 0 88 0 1806 10 0 0 0 0 289 0 0 0 0 0 233 0 23 0 0 0 0 452 0 0 0 0 0 0 0 0 0 0 50 0 7 0 0 0 0 28 0 420 0 0 3790 0 0 0
4 0 3 0 103 398 0 0 0 0 20 0 0 0 0 0 1709 0 7 0 0 0 0 231 0 0 0 0 0 0 0 0 0 0 7 0 198 0 0 0 0 1 0 55 0 0 1508 0 0 0
5 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 4 0 0 0
6 0 12 0 0 0 0 0 0 0 14 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 38 0 0 9 0 0 0
7 0 0 0 2 0 0 0 0 0 3 0 0 0 0 0 6 0 0 0 0 0 4 28 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 5 0 0 140 0 0 0
8 0 1 0 7 1 0 0 0 0 6 0 0 0 0 0 16 0 0 0 0 0 0 20 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 6 0 0 280 0 0 0
9 0 1253 0 166 2 0 0 0 0 2350 0 0 0 0 0 45 0 3 0 0 0 0 210 0 0 0 0 0 0 0 0 0 0 11 0 2 0 0 0 0 13 0 2600 0 0 1346 0 0 0
10 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 8 0 0 0
11 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 10 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 6 0 0 89 0 0 0
12 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 0 8 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 7 0 0 0
13 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 3 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 10 0 0 0
14 0 0 0 0 14 0 0 0 0 0 0 0 0 0 1 20 0 1 0 0 0 0 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 25 0 0 0
15 0 83 0 923 300 0 0 0 0 233 0 0 0 0 1 17311 0 24 0 0 0 0 766 0 0 0 0 0 0 0 0 0 0 35 0 2371 0 0 0 0 84 0 467 0 0 5342 0 0 0
16 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 3 0 0 0 0 0 1 24 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 74 0 0 0
17 0 7 0 201 5 0 0 0 0 19 0 0 0 0 0 222 0 982 0 0 0 0 86 0 0 0 0 0 0 0 0 0 0 31 0 6 0 0 0 0 5 0 143 0 0 2106 0 0 0
18 0 6 0 10 1 0 0 0 0 15 0 0 0 0 0 48 0 1 0 0 0 0 238 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 29 0 49 0 0 474 0 0 0
19 0 0 0 6 0 0 0 0 0 3 0 0 0 0 0 16 0 0 0 0 0 0 30 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 8 0 0 174 0 0 0
20 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 3 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 0 0 0
21 0 13 0 43 0 0 0 0 0 21 0 0 0 0 0 28 0 2 0 0 0 22 161 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 0 24 0 38 0 0 408 0 0 0
22 0 199 0 389 10 0 0 0 0 342 0 0 0 0 0 391 0 15 0 0 0 0 19812 0 0 0 0 0 0 0 0 0 0 21 0 32 0 0 0 0 17 0 632 0 0 4986 0 0 0
23 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0
24 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 5 0 0 0
25 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 2 0 0 0 0 0 0 8 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 38 0 0 0
26 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 0 0 0
27 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 7 0 0 0
28 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 3 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 13 0 0 0
29 0 18 0 1 0 0 0 0 0 20 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 37 0 0 15 0 0 0
30 0 0 0 1 0 0 0 0 0 2 0 0 0 0 0 4 0 0 0 0 0 1 9 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 8 0 0 65 0 0 0
31 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0
32 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 9 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 18 0 0 0
33 0 13 0 759 7 0 0 0 0 156 0 0 0 0 0 117 0 32 0 0 0 0 160 0 0 0 0 0 0 0 0 0 0 54 0 4 0 0 0 0 4 0 93 0 0 1781 0 0 0
34 0 6 0 0 0 0 0 0 0 5 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 7 0 0 2 0 0 0
35 0 8 0 78 18 0 0 0 0 31 0 0 0 0 0 4726 0 0 0 0 0 0 62 0 0 0 0 0 0 0 0 0 0 5 0 1528 0 0 0 0 2 0 41 0 0 812 0 0 0
36 0 1 0 3 0 0 0 0 0 2 0 0 0 0 0 3 0 0 0 0 0 0 27 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 0 12 0 0 115 0 0 0
37 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 9 0 0 0
38 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0
39 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 2 0 0 0 0 9 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 5 0 0 74 0 0 0
40 0 47 0 199 8 0 0 0 0 148 0 0 0 0 0 237 0 10 0 0 0 0 635 0 0 0 0 0 0 0 0 0 0 18 0 27 0 0 0 0 197 0 256 0 0 1436 0 0 0
41 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0
42 0 1525 0 204 3 0 0 0 0 2021 0 0 0 0 0 90 0 5 0 0 0 0 420 0 0 0 0 0 0 0 0 0 0 8 0 4 0 0 0 0 27 0 5280 0 0 2070 0 0 0
43 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 9 0 0 0
44 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 0 1 0 0 0 2 22 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 75 0 0 0
45 0 343 0 903 50 0 0 0 0 863 0 0 0 0 0 1297 0 369 0 0 0 0 2362 0 0 0 0 0 0 0 0 0 0 31 0 59 0 0 0 0 76 0 1332 0 0 85950 0 0 0
46 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0
47 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 0 0 15 0 0 0
48 0 28 0 4 0 0 0 0 0 43 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 81 0 0 27 0 0 0

KNeighbors

In [114]:
for n_neighbors in range(2,8):
    knc = KNeighborsClassifier(n_neighbors=n_neighbors)
    gbc.fit(XTrain, y1_train)  
    y_pred_train = gbc.predict(XTrain)
    y_pred_test = gbc.predict(XTest)
    print(n_neighbors,"\t",train_size,"\t",pca_i, "\t",
          metrics.accuracy_score(y_true = y1_train, y_pred = y_pred_train),"\t",
          metrics.accuracy_score(y_true = y1_test, y_pred = y_pred_test))
display(HTML(pd.DataFrame(metrics.confusion_matrix(y_true=y1_test, y_pred=y_pred_test)).to_html()))
2 	 0.01 	 50 	 0.999523355577 	 0.636496195806
3 	 0.01 	 50 	 0.999523355577 	 0.636486571028
4 	 0.01 	 50 	 0.999523355577 	 0.636592443587
5 	 0.01 	 50 	 0.999523355577 	 0.636679066589
6 	 0.01 	 50 	 0.999523355577 	 0.636573194031
7 	 0.01 	 50 	 0.999523355577 	 0.63657800642
0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20
0 1 9 7 0 0 0 0 2 24 47 4 32 124 0 248 14 5 0 11 0 70
1 0 1403 37 0 0 0 0 7 330 90 35 234 582 1 2386 119 49 0 20 1 179
2 0 28 689 0 0 0 0 4 160 28 11 396 572 1 1500 109 195 0 2 1 92
3 0 1 0 0 0 0 0 1 1 5 1 4 71 0 76 4 0 0 0 0 4
4 0 1 6 0 0 0 0 0 26 5 0 9 61 0 199 9 7 0 0 0 14
5 0 3 0 0 0 0 0 0 1 7 0 6 14 0 23 1 0 0 0 0 0
6 0 3 4 0 0 0 1 6 12 45 6 32 86 0 278 30 7 0 15 0 12
7 0 15 6 0 0 0 0 730 142 3 6 25 355 2 2741 91 362 0 7 0 166
8 0 46 50 0 0 0 0 19 2496 36 7 56 2124 1 6866 628 174 0 24 2 249
9 0 4 2 0 0 0 0 8 30 9314 16 31 319 3 1717 45 23 0 5 4 24
10 0 8 9 0 0 0 0 2 269 89 30 24 892 0 1158 41 24 0 5 1 50
11 0 17 86 0 0 0 0 3 24 134 5 4048 674 3 1732 149 8 0 16 1 203
12 0 10 16 0 0 0 0 10 80 27 12 50 22393 5 5026 219 65 0 8 7 62
13 0 3 0 0 0 0 0 0 9 82 6 63 33 1 338 21 1 0 2 0 31
14 1 73 51 0 0 0 0 108 719 304 39 214 4633 16 53873 1105 1363 0 62 30 496
15 0 27 21 0 0 0 0 12 154 12 6 60 1286 1 9179 5780 634 0 41 2 394
16 0 11 9 0 0 0 0 11 81 0 11 4 206 3 2603 232 26565 0 10 1 39
17 0 1 0 0 0 0 0 0 3 0 0 0 11 0 114 1 1 0 0 0 2
18 1 11 4 0 0 0 0 10 247 90 9 60 769 0 2660 688 23 0 656 1 198
19 0 7 15 0 0 0 0 3 44 71 14 82 426 2 1620 24 68 0 3 43 11
20 0 106 23 0 0 0 0 46 130 48 5 199 303 5 5060 660 251 0 13 2 4256
In [115]:
for n_neighbors in range(2,8):
    knc = KNeighborsClassifier(n_neighbors=n_neighbors)
    gbc.fit(XTrain, y2_train)  
    y_pred_train = gbc.predict(XTrain)
    y_pred_test = gbc.predict(XTest)
    print(n_neighbors,"\t",train_size,"\t",pca_i, "\t",
          metrics.accuracy_score(y_true = y2_train, y_pred = y_pred_train),"\t",
          metrics.accuracy_score(y_true = y2_test, y_pred = y_pred_test))
display(HTML(pd.DataFrame(metrics.confusion_matrix(y_true=y2_test, y_pred=y_pred_test)).to_html()))
2 	 0.01 	 50 	 0.99714013346 	 0.870397551456
3 	 0.01 	 50 	 0.99714013346 	 0.870286866509
4 	 0.01 	 50 	 0.99714013346 	 0.870373489511
5 	 0.01 	 50 	 0.99714013346 	 0.87033017801
6 	 0.01 	 50 	 0.99714013346 	 0.870373489511
7 	 0.01 	 50 	 0.99714013346 	 0.870368677122
0 1 2 3 4 5 6
0 104 0 28 729 0 121 1822
1 5 0 0 61 0 2 41
2 7 0 3 992 2 17 342
3 38 0 33 123050 61 96 3248
4 2 0 5 2876 115 35 2828
5 38 0 15 1604 18 328 6939
6 77 0 45 4572 22 216 57260
In [116]:
for n_neighbors in range(2,8):
    knc = KNeighborsClassifier(n_neighbors=n_neighbors)
    gbc.fit(XTrain, y3_train)  
    y_pred_train = gbc.predict(XTrain)
    y_pred_test = gbc.predict(XTest)
    print(n_neighbors,"\t",train_size,"\t",pca_i, "\t",
          metrics.accuracy_score(y_true = y3_train, y_pred = y_pred_train),"\t",
          metrics.accuracy_score(y_true = y3_test, y_pred = y_pred_test))
display(HTML(pd.DataFrame(metrics.confusion_matrix(y_true=y3_test, y_pred=y_pred_test)).to_html()))
2 	 0.01 	 50 	 0.99714013346 	 0.624267915321
3 	 0.01 	 50 	 0.996663489037 	 0.625856003696
4 	 0.01 	 50 	 0.996663489037 	 0.62598112581
5 	 0.01 	 50 	 0.99714013346 	 0.624383412658
6 	 0.01 	 50 	 0.996663489037 	 0.625990750588
7 	 0.01 	 50 	 0.99714013346 	 0.624195729486
0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0
1 0 891 0 69 7 0 0 0 0 1269 0 0 0 0 0 32 0 2 0 0 0 5 333 0 0 0 0 0 0 0 0 0 0 10 0 4 0 0 0 0 24 0 1859 0 0 2528 0 0 0
2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 4 0 0 0
3 0 53 0 435 11 0 0 0 0 120 0 0 0 0 0 157 0 10 0 0 0 10 435 0 0 0 0 0 0 0 0 0 0 14 0 9 0 0 0 0 23 0 227 0 0 5692 0 0 0
4 0 4 0 34 44 0 0 0 0 6 0 0 0 0 0 1457 0 3 0 0 0 12 256 0 0 0 0 0 0 0 0 0 0 2 0 76 0 0 0 0 22 0 57 0 0 2267 0 0 0
5 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 4 0 0 0
6 0 7 0 0 0 0 0 0 0 9 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 25 0 0 31 0 0 0
7 0 3 0 1 1 0 0 0 0 3 0 0 0 0 0 1 0 0 1 0 0 0 40 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 3 0 0 134 0 0 0
8 0 1 0 4 0 0 0 0 0 0 0 0 0 0 0 9 0 2 0 0 0 0 12 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 2 0 1 0 0 306 0 0 0
9 0 900 0 82 6 0 0 0 0 1554 0 0 0 0 0 46 0 1 0 0 0 8 332 0 0 0 0 0 0 0 0 0 0 10 0 9 0 0 0 0 32 0 1879 0 0 3142 0 0 0
10 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 8 0 0 0
11 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 9 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 94 0 0 0
12 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 4 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 2 0 0 13 0 0 0
13 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 0 3 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 10 0 0 0
14 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 9 0 0 0 0 0 0 3 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 52 0 0 0
15 0 51 0 286 71 0 0 0 0 121 0 0 0 0 0 17143 0 8 0 1 0 22 768 0 0 0 0 0 0 0 0 0 0 18 0 1253 0 0 0 0 68 0 287 0 0 7843 0 0 0
16 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 0 16 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 3 0 0 83 0 0 0
17 0 8 0 63 3 0 0 0 0 24 0 0 0 0 0 155 0 184 0 0 0 3 103 0 0 0 0 0 0 0 0 0 0 6 0 8 0 0 0 0 13 0 68 0 0 3175 0 0 0
18 0 1 0 0 1 0 0 0 0 7 0 0 0 0 0 35 0 1 0 0 0 1 205 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 0 11 0 22 0 0 585 0 0 0
19 0 1 0 2 1 0 0 0 0 0 0 0 0 0 0 10 0 1 0 0 0 2 28 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 5 0 2 0 0 185 0 0 0
20 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 4 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 3 0 0 0
21 0 5 0 15 0 0 0 0 0 6 0 0 0 0 0 9 0 0 0 0 0 1 135 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 8 0 16 0 0 567 0 0 0
22 0 81 0 121 20 0 0 0 0 126 0 0 0 0 0 345 0 5 0 0 0 18 16609 0 0 0 0 0 0 0 0 0 0 15 0 37 0 0 0 0 63 0 342 0 0 9064 0 0 0
23 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0
24 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0
25 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 5 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 45 0 0 0
26 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 0 0 0
27 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 6 0 0 0
28 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 5 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 12 0 0 0
29 0 14 0 0 0 0 0 0 0 8 0 0 0 0 0 0 0 0 0 0 0 0 6 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 22 0 0 43 0 0 0
30 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 3 0 0 0 0 0 0 16 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 4 0 0 64 0 0 0
31 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0
32 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 4 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 23 0 0 0
33 0 10 0 190 7 0 0 0 0 43 0 0 0 0 0 68 0 7 0 0 0 4 152 0 0 0 0 0 0 0 0 0 0 5 0 3 0 0 0 0 9 0 68 0 0 2614 0 0 0
34 0 2 0 0 0 0 0 0 0 3 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 10 0 0 5 0 0 0
35 0 10 0 28 7 0 0 0 0 31 0 0 0 0 0 4857 0 3 0 0 0 5 77 0 0 0 0 0 0 0 0 0 0 3 0 978 0 0 0 0 14 0 16 0 0 1282 0 0 0
36 0 2 0 0 0 0 0 0 0 4 0 0 0 0 0 4 0 0 0 0 0 1 33 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 5 0 4 0 0 112 0 0 0
37 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 8 0 0 0
38 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0
39 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 3 0 1 0 0 0 0 7 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 77 0 0 0
40 0 28 0 101 5 0 0 0 0 71 0 0 0 0 0 154 0 3 0 0 0 2 575 0 0 0 0 0 0 0 0 0 0 8 0 22 0 0 0 0 26 0 170 0 0 2053 0 0 0
41 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0
42 0 927 0 82 9 0 0 0 0 1460 0 0 0 0 0 75 0 3 0 0 0 9 735 0 0 0 0 0 0 1 0 0 0 11 0 9 0 0 0 0 43 0 3297 0 0 4996 0 0 0
43 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 10 0 0 0
44 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 0 14 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 87 0 0 0
45 0 202 0 239 33 0 0 0 0 248 0 0 0 0 0 1493 0 91 1 0 0 102 1786 0 0 0 0 0 0 1 0 0 0 43 0 159 0 0 0 0 205 0 492 0 0 88539 0 0 1
46 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0
47 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 3 0 0 14 0 0 0
48 0 18 0 0 0 0 0 0 0 34 0 0 0 0 0 4 0 0 0 0 0 0 4 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 49 0 0 76 0 0 0

Create interactive chart

Must rerun for interactive chart!

In [7]:
%%javascript
// Since I append the div later, sometimes there are multiple divs.
$("#container3").remove();

// Make the cdiv to contain the chart.
element.append('<div id="container3" style="min-width: 310px; height: 400px; margin: 0 auto"></div>');

// Require highcarts and make the chart.
require(['highcharts_exports'], function(Highcharts) {
    $('#container3').highcharts({
         title: {
        text: 'Classification'
    },
        plotOptions: {
            scatter: {
                dataLabels: {
                    format: "{point.name}",
                    enabled: true
                },
                
                enableMouseTracking: false
            }
        },
        
        yAxis: {
        title: {
            text: 'test'
        }
    },xAxis: {
        title: {
            text: 'train'
        }
    },
       
        legend: {
            enabled: false
        },
        series: [{name:'Taster - MLPClassifier',data:[[1,0.69877332204]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'Category - MLPClassifier',data:[[1,0.858154833804]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'Country - MLPClassifier',data:[[1,0.582068076055]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'Taster - BaggingClassifier',data:[[0.993326978074,0.569632862842]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'Category - BaggingClassifier',data:[[0.988083889418,0.850993998951]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'Country - BaggingClassifier',data:[[0.990943755958,0.597799775743]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'Taster - AdaBoostClassifier',data:[[0.370829361296,0.379793741007]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'Category - AdaBoostClassifier',data:[[0.780266920877,0.790694764602]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'Country - AdaBoostClassifier',data:[[0.491420400381,0.500733889325]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'Taster - DecisionTree',data:[[0.60962821735,0.512894796364]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'Category - DecisionTree',data:[[0.902764537655,0.833905205561]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'Country - DecisionTree',data:[[0.60819828408,0.56407936592]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'Taster - RandomForestClassifier',data:[[0.992850333651,0.551648965096]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'Category - RandomForestClassifier',data:[[0.988560533842,0.832899416257]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'Country - RandomForestClassifier',data:[[0.994280266921,0.580686920408]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'Taster - ExtraTreesClassifier',data:[[1,0.51779380838]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'Category - ExtraTreesClassifier',data:[[1,0.814010789376]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'Country - ExtraTreesClassifier',data:[[1,0.561268930735]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'Taster - GradientBoostingClassifier',data:[[0.999523355577,0.636462509083]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'Category - GradientBoostingClassifier',data:[[0.99714013346,0.870224305452]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'Country - GradientBoostingClassifier',data:[[0.995710200191,0.624893525893]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'Taster - C-Support Vector Classification',data:[[0.989037178265,0.729202057778]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'Category - C-Support Vector Classification',data:[[0.991420400381,0.886119626366]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'Country - C-Support Vector Classification',data:[[0.981410867493,0.660861321386]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}},
{name:'Taster - KNeighbors',data:[[0.999523355577,0.636597255976]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'square'}},
{name:'Category - KNeighbors',data:[[0.99714013346,0.870397551456]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'diamond'}},
{name:'Country - KNeighbors',data:[[0.996663489037,0.62598112581]],marker:{fillColor:'rgba(255,0,0,.5)',symbol:'triangle'}}]
    });
});

SVM again is the best across the board.

Cross Validation & VotingClassifier

In [110]:
train_size=.1
pca_i=50
output_data = return_model_data(train_size=train_size, pca_i=pca_i, input_df=df2, keep_vars=['points','price_per_liter'])
XTrain, XTest, y1_train, y1_test, y2_train, y2_test, y3_train, y3_test,lookupTable1,lookupTable2,lookupTable3,vectorizer,normalizer1,svd1,scaler = output_data
In [111]:
mlpc = MLPClassifier(solver='lbfgs', alpha=1e-5, random_state=41)
bgc = BaggingClassifier(random_state=41)
abc = AdaBoostClassifier(random_state=41)
dtc = DecisionTreeClassifier(max_depth=13,random_state=41)
rfc = RandomForestClassifier( random_state=41)
etc = ExtraTreesClassifier( random_state=41)
gbc = GradientBoostingClassifier(random_state=41)
svc = SVC(kernel='rbf',C=3)
knc = KNeighborsClassifier(n_neighbors=5)

eclf = VotingClassifier(estimators=[('mlpc', mlpc),
                                    ('bgc', bgc),
                                    ('abc', abc),
                                    ('dtc', dtc),
                                    ('rfc', rfc),
                                    ('etc', etc),
                                    ('gbc', gbc),
                                    ('svc', svc),
                                    ('knc', knc)], voting='hard')

list_of_models = [mlpc,bgc,abc,dtc,rfc,etc,gbc,svc,knc,eclf ]
list_of_models_names = ['MLP', 'Bagging', 'AdaBoostr', 'DecisionTree',
                        'RandomForest', 'ExtraTrees', 'GradientBoosting', 'C-Support Vector Classification',
                        'KNeighbors','VotingEnsemble']
for clf, label in zip(list_of_models, list_of_models_names):
    scores = cross_val_score(clf, XTrain, y1_train, cv=3, scoring='accuracy')
    print("Accuracy: %0.2f (+/- %0.2f) [%s] %s" % (scores.mean(), scores.std(), label, "y1"))
    #scores = cross_val_score(clf, XTrain, y2_train, cv=3, scoring='accuracy')
    #print("Accuracy: %0.2f (+/- %0.2f) [%s] %s" % (scores.mean(), scores.std(), label, "y2"))
    #scores = cross_val_score(clf, XTrain, y3_train, cv=3, scoring='accuracy')
    #print("Accuracy: %0.2f (+/- %0.2f) [%s] %s" % (scores.mean(), scores.std(), label, "y3"))
Accuracy: 0.74 (+/- 0.00) [MLP] y1
Accuracy: 0.63 (+/- 0.00) [Bagging] y1
Accuracy: 0.42 (+/- 0.04) [AdaBoostr] y1
Accuracy: 0.53 (+/- 0.00) [DecisionTree] y1
Accuracy: 0.61 (+/- 0.00) [RandomForest] y1
Accuracy: 0.59 (+/- 0.01) [ExtraTrees] y1
Accuracy: 0.72 (+/- 0.00) [GradientBoosting] y1
Accuracy: 0.80 (+/- 0.00) [C-Support Vector Classification] y1
Accuracy: 0.62 (+/- 0.01) [KNeighbors] y1
Accuracy: 0.73 (+/- 0.01) [VotingEnsemble] y1
In [116]:
mlpc = MLPClassifier(solver='lbfgs', alpha=1e-5, random_state=41)
gbc = GradientBoostingClassifier(random_state=41)
svc = SVC(kernel='rbf',C=3)

eclf = VotingClassifier(estimators=[('mlpc', mlpc),
                                    ('gbc', gbc),
                                    ('svc', svc)], voting='hard')
scores = cross_val_score(eclf, XTrain, y1_train, cv=3, scoring='accuracy')
print("Accuracy: %0.2f (+/- %0.2f) [%s] %s" % (scores.mean(), scores.std(), label, "y1"))
Accuracy: 0.79 (+/- 0.00) [VotingEnsemble] y1

C-Support Vector Classification

In [112]:
svc = SVC(kernel='rbf',C=3)
svc.fit(XTrain, y1_train)  
y_pred_train = svc.predict(XTrain)
y_pred_test = svc.predict(XTest)
print(train_size,"\t",pca_i, "\t",
      metrics.accuracy_score(y_true = y1_train, y_pred = y_pred_train),"\t",
      metrics.accuracy_score(y_true = y1_test, y_pred = y_pred_test))
0.1 	 50 	 0.959597884606 	 0.810339533948
In [113]:
svc = SVC(kernel='rbf',C=3)
svc.fit(XTrain, y2_train)  
y_pred_train = svc.predict(XTrain)
y_pred_test = svc.predict(XTest)
print(train_size,"\t",pca_i, "\t",
      metrics.accuracy_score(y_true = y2_train, y_pred = y_pred_train),"\t",
      metrics.accuracy_score(y_true = y2_test, y_pred = y_pred_test))
0.1 	 50 	 0.974129305827 	 0.905715011699
In [114]:
svc = SVC(kernel='rbf',C=3)
svc.fit(XTrain, y3_train)  
y_pred_train = svc.predict(XTrain)
y_pred_test = svc.predict(XTest)
print(train_size,"\t",pca_i, "\t",
      metrics.accuracy_score(y_true = y3_train, y_pred = y_pred_train),"\t",
      metrics.accuracy_score(y_true = y3_test, y_pred = y_pred_test))
0.1 	 50 	 0.926580589833 	 0.708002922088

Custom Neural Network

In [73]:
class neuralNetwork:
    def __init__(self, input_nodes, hidden_nodes, output_nodes, learning_rate, lookupTable1, lookupTable2,
                 lookupTable3,weights_input_to_hidden,weights_hidden_to_output):
        self.input_nodes = input_nodes
        self.hidden_nodes = hidden_nodes
        self.output_nodes = output_nodes
        self.weights_input_to_hidden = weights_input_to_hidden
        self.weights_hidden_to_output = weights_hidden_to_output
        self.learning_rate = learning_rate
        self.lookupTable1 = lookupTable1
        self.lookupTable2 = lookupTable2
        self.lookupTable3 = lookupTable3
        self.y1_n = len(lookupTable1)
        self.y2_n = len(lookupTable2)
        self.y3_n = len(lookupTable3)
        self.e = 0
        pass
    
    def pickle(self):
        joblib.dump(self.weights_input_to_hidden, 'weights_input_to_hidden'+str(self.e)+'.pkl')
        joblib.dump(self.weights_hidden_to_output, 'weights_hidden_to_output'+str(self.e)+'.pkl')
        joblib.dump(self.lookupTable1, 'lookupTable1'+str(self.e)+'.pkl')
        joblib.dump(self.lookupTable2, 'lookupTable2'+str(self.e)+'.pkl')
        joblib.dump(self.lookupTable3, 'lookupTable3'+str(self.e)+'.pkl')
        joblib.dump(self.input_nodes, 'input_nodes'+str(self.e)+'.pkl')
        joblib.dump(self.hidden_nodes, 'hidden_nodes'+str(self.e)+'.pkl')
        joblib.dump(self.output_nodes, 'output_nodes'+str(self.e)+'.pkl')
        pass
    
    def activation_function(self,x):
        return sc.special.expit(x)
    
    def inverse_activation_function(self,x):
        return sc.special.logit(x)
    
    def get_e(self):
        return self.e
    
    def get_lookup_Tables(self):
        return (self.lookupTable1, self.lookupTable2, self.lookupTable3)
    
    def train(self, inputs_list, targets_list):
        inputs = np.array(inputs_list, ndmin=2).T
        targets = np.array(targets_list, ndmin=2).T
        hidden_inputs = np.dot(self.weights_input_to_hidden, inputs)
        hidden_outputs = self.activation_function(hidden_inputs)
        final_inputs = np.dot(self.weights_hidden_to_output, hidden_outputs)
        final_outputs = self.activation_function(final_inputs)
        output_errors = targets - final_outputs
        hidden_errors = np.dot(self.weights_hidden_to_output.T, output_errors)
        self.weights_hidden_to_output += self.learning_rate * np.dot(
            (output_errors * final_outputs * (1.0 - final_outputs)), np.transpose(hidden_outputs))
        self.weights_input_to_hidden += self.learning_rate * np.dot(
            (hidden_errors * hidden_outputs * (1.0 - hidden_outputs)), np.transpose(inputs))
        self.e += 1
        pass

    def train_df(self, XTrain, y1_train, y2_train, y3_train):
        for x, y1, y2, y3 in zip(XTrain, y1_train, y2_train, y3_train):
            inputs = np.asfarray(x)
            targets = np.zeros(self.output_nodes) + 0.01
            targets[y1] = .99
            targets[self.y1_n + y2] = .99
            targets[self.y1_n + self.y2_n + y3] = .99
            self.train(inputs, targets)
            pass
        pass

    def query(self, inputs_list):
        inputs = np.array(inputs_list, ndmin=2).T
        hidden_inputs = np.dot(self.weights_input_to_hidden, inputs)
        hidden_outputs = self.activation_function(hidden_inputs)
        final_inputs = np.dot(self.weights_hidden_to_output, hidden_outputs)
        final_outputs = self.activation_function(final_inputs)
        return final_outputs

    def back_query(self, targets_list):
        final_outputs = np.array(targets_list, ndmin=2).T
        final_inputs = self.inverse_activation_function(final_outputs)
        hidden_outputs = np.dot(self.weights_hidden_to_output.T, final_inputs)
        hidden_outputs -= np.min(hidden_outputs)
        hidden_outputs /= np.max(hidden_outputs)
        hidden_outputs *= 0.98
        hidden_outputs += 0.01
        hidden_inputs = self.inverse_activation_function(hidden_outputs)
        inputs = np.dot(self.weights_input_to_hidden.T, hidden_inputs)
        return inputs

    def get_target(self, country, category, taster):
        targets = np.zeros(self.output_nodes) + 0.01
        targets[self.lookupTable1.tolist().index(taster)] = .99
        targets[self.y1_n + self.lookupTable2.tolist().index(category)] = .99
        targets[self.y1_n + self.y2_n + self.lookupTable3.tolist().index(country)] = .99
        return targets
    
    def create_word_cloud(self, plot_cloud, cuttoff, abs_pass, country, category, taster, vectorizer, scaler, svd, map_mask_path):
        my_input = self.back_query(self.get_target(country, category, taster))
        my_input2 = scaler.inverse_transform([x[0] for x in my_input])
        my_input3 = svd.inverse_transform(my_input2[2:].reshape(1, -1))[0]
        my_features = vectorizer.get_feature_names()
        
        if abs_pass:
            high_indexes = np.where(np.abs(my_input3)>cuttoff)
        else:
            high_indexes = np.where(my_input3>cuttoff)
            
        my_dic ={}
        for x in high_indexes[0]:
            my_dic[my_features[x]]= math.floor(my_input3[x]*1000)
        
        if plot_cloud:
            map_mask = np.array(Image.open(map_mask_path))
            wc = WordCloud(background_color="white", max_words=2000, mask=map_mask)
            wc.generate_from_frequencies(my_dic)
            plt.imshow(wc, interpolation='bilinear')
            plt.axis("off")
            plt.show()
        return my_dic

    def get_accuracy(self, XTrain, y1_train, y2_train, y3_train):
        count_right1 = 0
        count_right2 = 0
        count_right3 = 0
        count_total = 0
        for x, y1, y2, y3 in zip(XTrain, y1_train, y2_train, y3_train):
            inputs = np.asfarray(x)
            predicted_value = self.query(inputs)
            y_1_prediction = np.argmax([predicted_value[i][0] for i in range(0, self.y1_n)])
            y_2_prediction = np.argmax([predicted_value[i][0] for i in range(self.y1_n, self.y1_n + self.y2_n)])
            y_3_prediction = np.argmax(
                [predicted_value[i][0] for i in range(self.y1_n + self.y2_n, self.y1_n + self.y2_n + self.y3_n)])
            count_total += 1
            if y1 == y_1_prediction:
                count_right1 += 1
            if y2 == y_2_prediction:
                count_right2 += 1
            if y3 == y_3_prediction:
                count_right3 += 1
        print(self.hidden_nodes, self.e, count_total, count_right1 / count_total, count_right2 / count_total,
              count_right3 / count_total, count_total, sep='\t')
        return (self.hidden_nodes, self.e, count_total, count_right1 / count_total, count_right2 / count_total,
                count_right3 / count_total, count_total)
    
    def labeled_query(self, x):
        inputs = np.asfarray(x)
        predicted_value = self.query(inputs)
        y_1_prediction = np.argmax([predicted_value[i][0] for i in range(0, self.y1_n)])
        y_2_prediction = np.argmax([predicted_value[i][0] for i in range(self.y1_n, self.y1_n + self.y2_n)])
        y_3_prediction = np.argmax( [predicted_value[i][0] for i in range(self.y1_n + self.y2_n, self.y1_n + self.y2_n + self.y3_n)])
        return(self.lookupTable1[y_1_prediction],
               self.lookupTable2[y_2_prediction],
               self.lookupTable3[y_3_prediction])
In [64]:
train_size=.5
pca_i=100
output_data = return_model_data(train_size=train_size, pca_i=pca_i, input_df=df2, keep_vars=['points','price_per_liter'], save_tools = True)
XTrain, XTest, y1_train, y1_test, y2_train, y2_test, y3_train, y3_test, lookupTable1, lookupTable2, lookupTable3, vectorizer, normalizer1, svd1, scaler = output_data
In [74]:
y1_n= len(lookupTable1)
y2_n= len(lookupTable2)
y3_n= len(lookupTable3)

# number of input, hidden and output nodes
input_nodes = XTrain.shape[1]
hidden_nodes = 100
output_nodes = y1_n + y2_n + y3_n

# learning rate
learning_rate = 0.001
weights_input_to_hidden = np.random.normal(0.0, pow(input_nodes, -0.5),
                                                        (hidden_nodes, input_nodes))
weights_hidden_to_output = np.random.normal(0.0, pow(hidden_nodes, -0.5),
                                                         (output_nodes, hidden_nodes))

n = neuralNetwork(input_nodes, hidden_nodes, output_nodes, learning_rate, 
                  lookupTable1, lookupTable2, lookupTable3,
                 weights_input_to_hidden=weights_input_to_hidden,
                  weights_hidden_to_output=weights_hidden_to_output)
In [72]:
n.pickle()
In [75]:
epochs = 1000
for e in range(epochs):
    XTrain, y1_train, y2_train, y3_train = shuffle(XTrain, y1_train,y2_train,y3_train)
    n.train_df(XTrain,y1_train, y2_train, y3_train)
    n.get_accuracy(XTrain, y1_train, y2_train, y3_train)
    n.get_accuracy(XTest, y1_test, y2_test, y3_test)
    n.pickle()
    pass
100	104947	104947	0.6894718286373122	0.887467007155993	0.6630584961933166	104947
100	104947	104948	0.6856538476197735	0.8878682776232039	0.6635571902275412	104948
100	209894	104947	0.7654339809618188	0.8897348185274472	0.6752932432561197	104947
100	209894	104948	0.7603289248008538	0.8903838091245188	0.6745340549605519	104948
100	314841	104947	0.7889696704050616	0.8950136735685632	0.6825826369500796	104947
100	314841	104948	0.7832545641651103	0.895329115371422	0.6815565804017227	104948
100	419788	104947	0.7985459327088912	0.8991681515431599	0.6870992024545723	104947
100	419788	104948	0.7928879063917369	0.8991119411518085	0.6854823341083203	104948
100	524735	104947	0.8069215889925391	0.9022173096896529	0.6904913908925457	104947
100	524735	104948	0.8010443267141822	0.9021515417158974	0.6886458055417921	104948
100	629682	104947	0.8117049558348499	0.9040563331967565	0.6935596062774544	104947
100	629682	104948	0.8058085909212181	0.9039905476998132	0.691685406105881	104948
100	734629	104947	0.8169552250183426	0.9053045823129771	0.6965039496126616	104947
100	734629	104948	0.8103918130883866	0.9052006708084004	0.6941723520219537	104948
100	839576	104947	0.8194612518699915	0.9068386900054313	0.6977521987288822	104947
100	839576	104948	0.8130121584022564	0.9066108930136829	0.6957445592102756	104948
100	944523	104947	0.8227295682582637	0.9082298684097687	0.7001438821500376	104947
100	944523	104948	0.8159278880969623	0.9073541182299806	0.6976216793078477	104948
100	1049470	104947	0.8250736085833802	0.909039801042431	0.7021163063260503	104947
100	1049470	104948	0.8180241643480581	0.9082783854861455	0.6994606852917635	104948
100	1154417	104947	0.8266267735142501	0.9097353902445997	0.7019733770379334	104947
100	1154417	104948	0.819215230399817	0.9087738689636773	0.7003658954911004	104948
100	1259364	104947	0.8277416219615615	0.9101355922513269	0.7055847237176861	104947
100	1259364	104948	0.8201585547128102	0.9091550101002401	0.7028814269924153	104948
100	1364311	104947	0.8297998037104444	0.9110979827913137	0.7047938483234395	104947
100	1364311	104948	0.8211114075542173	0.9102412623394442	0.702576514083165	104948
100	1469258	104947	0.830590679104691	0.911221854841015	0.7090245552516985	104947
100	1469258	104948	0.8226169150436407	0.9106319320044213	0.7053493158516598	104948
100	1574205	104947	0.8313434400221064	0.9117459288974434	0.7095200434505036	104947
100	1574205	104948	0.8230838129359301	0.9112322292945078	0.7055208293631131	104948
100	1679152	104947	0.832229601608431	0.9122795315730797	0.7104729053712826	104947
100	1679152	104948	0.8240938369478218	0.9116133704310706	0.7066642527728018	104948
100	1784099	104947	0.8330585914795087	0.912632090483768	0.7115782251993864	104947
100	1784099	104948	0.8253230171132371	0.911661013073141	0.7079982467507718	104948
100	1889046	104947	0.8343449550725605	0.9134896662124692	0.7128645887924381	104947
100	1889046	104948	0.8254659450394481	0.9124899950451653	0.7080649464496703	104948
100	1993993	104947	0.8348499718905733	0.9136516527390016	0.7129503463653082	104947
100	1993993	104948	0.825847086176011	0.912585280329306	0.708427030529405	104948
100	2098940	104947	0.8358504769073913	0.9136135382621704	0.7153229725480481	104947
100	2098940	104948	0.8272001372108092	0.9130235926363532	0.7103708503258757	104948
100	2203887	104947	0.8361935071988719	0.9141185550801834	0.7154944876937883	104947
100	2203887	104948	0.8273430651370203	0.9134428478865724	0.7101040515302817	104948
100	2308834	104947	0.8374512849343002	0.9147474439478975	0.7160757334654635	104947
100	2308834	104948	0.8284102603193962	0.9138239890231352	0.7111426611274155	104948
100	2413781	104947	0.8375846856032092	0.9149284877128455	0.7165902789026842	104947
100	2413781	104948	0.8283340320920837	0.9135190761138849	0.7119144719289553	104948
100	2518728	104947	0.8380230020867676	0.9149951880473001	0.718286373121671	104947
100	2518728	104948	0.828772344399131	0.91392880283569	0.7121907992529634	104948
100	2623675	104947	0.8389949212459622	0.9156050196765987	0.7178956997341516	104947
100	2623675	104948	0.829572740785913	0.9144814574837062	0.7125052406906277	104948
100	2728622	104947	0.8399096686899101	0.9158813496336246	0.7198681239101642	104947
100	2728622	104948	0.8303826657011091	0.9148625986202691	0.7140202767084651	104948
100	2833569	104947	0.8396905104481309	0.9148427301399754	0.7192964067576968	104947
100	2833569	104948	0.830373137172695	0.9138335175515493	0.7134580935320349	104948
100	2938516	104947	0.841415190524741	0.9159861644449103	0.7203064403937225	104947
100	2938516	104948	0.8314403323550711	0.9148340130350269	0.713601021458246	104948
100	3043463	104947	0.8416057629088969	0.9159385213488713	0.7196775515260083	104947
100	3043463	104948	0.8323074284407516	0.9144719289552922	0.7140202767084651	104948
100	3148410	104947	0.8425967393065071	0.9161957940674816	0.719925295625411	104947
100	3148410	104948	0.8321454434577124	0.9151960971147616	0.7134866791172771	104948
100	3253357	104947	0.8428921265019486	0.9165959960742089	0.7225456659075533	104947
100	3253357	104948	0.832478941952205	0.9151103403590349	0.7152303998170523	104948
100	3358304	104947	0.843606772942533	0.9169390263656894	0.7231936120136832	104947
100	3358304	104948	0.8334317947936121	0.9154628959103556	0.7159545679765217	104948
100	3463251	104947	0.843282799889468	0.9173964000876633	0.7230792685831896	104947
100	3463251	104948	0.8325837557647597	0.9157487517627777	0.7159069253344513	104948
100	3568198	104947	0.8441213183797536	0.9173296997532088	0.7238606153582284	104947
100	3568198	104948	0.8340320920836986	0.9158154514616763	0.7162880664710142	104948
100	3673145	104947	0.8446644496745976	0.9175583866141958	0.724651490752475	104947
100	3673145	104948	0.8334794374356824	0.9159869649731296	0.7158402256355528	104948
100	3778092	104947	0.844693035532221	0.9175393293757802	0.7243179890802024	104947
100	3778092	104948	0.8340606776689408	0.9156629950070511	0.7165834508518505	104948
100	3883039	104947	0.8449788941084547	0.9175202721373645	0.7247277197061374	104947
100	3883039	104948	0.8340416206121126	0.9158630941037467	0.7165453367381942	104948
100	3987986	104947	0.8454743823072599	0.9182635044355723	0.7254804806235529	104947
100	3987986	104948	0.8350040019819339	0.9163299919960362	0.7162023097152875	104948
100	4092933	104947	0.8458269412179481	0.9182635044355723	0.725156507570488	104947
100	4092933	104948	0.8346419179021992	0.9162728208255517	0.7166787361359912	104948
100	4197880	104947	0.8465987593737792	0.9184064337236891	0.7256615243885008	104947
100	4197880	104948	0.8359092121812707	0.9165586766779739	0.7173743187102184	104948
100	4302827	104947	0.8466845169466493	0.918682763680715	0.7259759688223579	104947
100	4302827	104948	0.8354994854594656	0.9166634904905286	0.7173933757670465	104948
100	4407774	104947	0.8475516212945582	0.9192925953100136	0.7265095714979942	104947
100	4407774	104948	0.8357091130845752	0.9169588748713648	0.717317147539734	104948
100	4512721	104947	0.8479518233012854	0.919140137402689	0.7258997398686956	104947
100	4512721	104948	0.8359282692380988	0.9171399169112322	0.7172123337271792	104948
100	4617668	104947	0.8473229344335712	0.9190257939721955	0.7271765748425396	104947
100	4617668	104948	0.8353470290048405	0.9167301901894271	0.7178126310172657	104948
100	4722615	104947	0.8483996684040516	0.9190734370682344	0.7263094704946306	104947
100	4722615	104948	0.8365952662270839	0.9167873613599116	0.7168311925906163	104948
100	4827562	104947	0.8485425976921684	0.9193878815020915	0.7280055647136173	104947
100	4827562	104948	0.8372336776308267	0.9170351030986774	0.7178126310172657	104948
100	4932509	104947	0.8490762003678047	0.9197785548896109	0.7286249249621237	104947
100	4932509	104948	0.8373766055570377	0.9174353012920684	0.7179460304150627	104948
100	5037456	104947	0.8484568401192983	0.9194259959789227	0.727976978855994	104947
100	5037456	104948	0.8366524373975683	0.9170827457407478	0.717460075465945	104948
100	5142403	104947	0.8494859309937397	0.9193307097868448	0.7287583256310328	104947
100	5142403	104948	0.8369954644204749	0.9171780310248885	0.7182033006822426	104948
100	5247350	104947	0.8494573451361164	0.9197118545551564	0.7296444872173573	104947
100	5247350	104948	0.8369764073636468	0.9174734154057247	0.7189655829553684	104948
100	5352297	104947	0.8497813181891812	0.9194450532173383	0.7290918273033055	104947
100	5352297	104948	0.8370526355909593	0.9173114304226855	0.7183176430232115	104948
100	5457244	104947	0.8500290622885838	0.9200358276082213	0.7289488980151886	104947
100	5457244	104948	0.8374814193695925	0.9178164424286314	0.7184510424210085	104948
100	5562191	104947	0.8504387929145187	0.9201596996579225	0.7297302447902274	104947
100	5562191	104948	0.8375290620116629	0.9180165415253269	0.7185939703472196	104948
100	5667138	104947	0.8498861330004669	0.9202168713731693	0.7296826016941885	104947
100	5667138	104948	0.8374528337843503	0.917521058047795	0.7185367991767352	104948
100	5772085	104947	0.8506484225370902	0.9199405414161433	0.7286344535813315	104947
100	5772085	104948	0.8368239509090216	0.9174067157068262	0.717860273659336	104948
100	5877032	104947	0.8507532373483758	0.9207123595719744	0.730387719515565	104947
100	5877032	104948	0.8374909478980066	0.917768799786561	0.7183557571368678	104948
100	5981979	104947	0.851839499938064	0.9204836727109874	0.7307307498070454	104947
100	5981979	104948	0.837681518466288	0.9174543583488965	0.7190608682395091	104948
100	6086926	104947	0.8514583551697523	0.9207028309527666	0.7308165073799155	104947
100	6086926	104948	0.8379959599039525	0.9177116286160766	0.7196706940580097	104948
100	6191873	104947	0.8517918568420251	0.9208552888600913	0.7312357666250584	104947
100	6191873	104948	0.8374718908411785	0.9179212562411861	0.7192419102793765	104948
100	6296820	104947	0.8519729006069731	0.9207790599064289	0.7313786959131752	104947
100	6296820	104948	0.83740519114228	0.9175496436330373	0.7195754087738689	104948
100	6401767	104947	0.8520396009414276	0.9209505750521692	0.7319122985888115	104947
100	6401767	104948	0.8376529328810458	0.9182166406220224	0.7192800243930327	104948
100	6506714	104947	0.8520586581798432	0.9209696322905848	0.7312262380058506	104947
100	6506714	104948	0.8372241491024126	0.9182071120936083	0.7195372946602127	104948
100	6611661	104947	0.8525922608554795	0.9212269050091951	0.7319885275424738	104947
100	6611661	104948	0.837938788733468	0.9184262682471319	0.7201471204787133	104948
100	6716608	104947	0.8521920588487523	0.9211220901979095	0.7304544198500196	104947
100	6716608	104948	0.8378530319777413	0.9177878568433891	0.7190036970690247	104948
100	6821555	104947	0.8526399039515183	0.921960608688195	0.7317026689662401	104947
100	6821555	104948	0.8377386896367724	0.9185310820596867	0.7193467240919312	104948
100	6926502	104947	0.8533545503921027	0.9213603056781042	0.7325126015989023	104947
100	6926502	104948	0.8382151160574761	0.9181308838662957	0.7195182376033845	104948
100	7031449	104947	0.8531735066271546	0.9212554908668185	0.7322934433571231	104947
100	7031449	104948	0.8380245454891947	0.918302397377749	0.7190894538247513	104948
100	7136396	104947	0.8529257625277521	0.9217414504464158	0.7330938473705775	104947
100	7136396	104948	0.8377291611083584	0.9180260700537409	0.7195372946602127	104948
100	7241343	104947	0.8528590621932975	0.9216080497775068	0.7335798069501749	104947
100	7241343	104948	0.8375004764264207	0.918302397377749	0.7200232496093303	104948
100	7346290	104947	0.8532592642000247	0.9213317198204808	0.7318265410159414	104947
100	7346290	104948	0.8380912451880932	0.918035598582155	0.7193848382055875	104948
100	7451237	104947	0.8539262675445701	0.9218557938769093	0.7327317598406815	104947
100	7451237	104948	0.8376910469947021	0.9186358958722415	0.7198898502115333	104948
100	7556184	104947	0.8541549544055571	0.9219415514497794	0.7329032749864217	104947
100	7556184	104948	0.8384914433814842	0.9185977817585852	0.719985135495674	104948
100	7661131	104947	0.8543264695512973	0.921598521158299	0.7342563389139279	104947
100	7661131	104948	0.8385200289667264	0.9186644814574837	0.7202519342912681	104948
100	7766078	104947	0.8537071093027909	0.921827208019286	0.7335798069501749	104947
100	7766078	104948	0.8377291611083584	0.9185501391165148	0.7204520333879636	104948
100	7871025	104947	0.8543169409320895	0.9221702383107664	0.7331033759897854	104947
100	7871025	104948	0.83740519114228	0.9189122231962495	0.71971833670008	104948
100	7975972	104947	0.8537833382564533	0.921731921827208	0.7334940493773048	104947
100	7975972	104948	0.8377863322788428	0.9187597667416244	0.720509204558448	104948
100	8080919	104947	0.8539643820214012	0.9219891945458184	0.7333511200891879	104947
100	8080919	104948	0.838596257194039	0.9185310820596867	0.7193657811487594	104948
100	8185866	104947	0.8542883550744662	0.9215222922046367	0.733617921427006	104947
100	8185866	104948	0.838729656591836	0.9181022982810535	0.7204901475016199	104948
100	8290813	104947	0.8544026985049596	0.9216461642543379	0.73325583389711	104947
100	8290813	104948	0.8386343713076952	0.91870259557114	0.7198231505126348	104948
100	8395760	104947	0.8552698028528686	0.9219224942113639	0.734342096486798	104947
100	8395760	104948	0.8386915424781797	0.9185215535312726	0.7209475168654953	104948
100	8500707	104947	0.8548791294653492	0.921731921827208	0.7344850257749149	104947
100	8500707	104948	0.8391774974272973	0.9186549529290696	0.720242405762854	104948
100	8605654	104947	0.8548981867037647	0.9216842787311691	0.7348375846856032	104947
100	8605654	104948	0.8383866295689294	0.9186454244006556	0.720642603956245	104948
100	8710601	104947	0.8548600722269336	0.9217605076848314	0.7347899415895642	104947
100	8710601	104948	0.8388535274612189	0.91870259557114	0.7212810153599878	104948
100	8815548	104947	0.8549934728958427	0.9219510800689872	0.7339990661953176	104947
100	8815548	104948	0.8394252391660632	0.9181308838662957	0.7205949613141747	104948
100	8920495	104947	0.8547266715580245	0.9219796659266106	0.7354664735533174	104947
100	8920495	104948	0.838729656591836	0.9186073102869993	0.719184739108892	104948
100	9025442	104947	0.8548219577501024	0.9219510800689872	0.7343992682020448	104947
100	9025442	104948	0.8382913442847887	0.9184834394176163	0.7209379883370812	104948
100	9130389	104947	0.8550506446110894	0.9219415514497794	0.7345040830133306	104947
100	9130389	104948	0.8390440980295003	0.9187788237984525	0.7202519342912681	104948
100	9235336	104947	0.8548696008461414	0.9221511810723508	0.7350567429273824	104947
100	9235336	104948	0.8393775965239928	0.918578724701757	0.7210809162632923	104948
100	9340283	104947	0.8553555604257387	0.9217700363040392	0.7346946553974864	104947
100	9340283	104948	0.8395205244502039	0.9187216526279681	0.7210904447917064	104948
100	9445230	104947	0.8553936749025699	0.9219224942113639	0.7352854297883694	104947
100	9445230	104948	0.839129854785227	0.9184834394176163	0.7212714868315737	104948
100	9550177	104947	0.8552221597568297	0.9215413494430522	0.7355903456030186	104947
100	9550177	104948	0.8396062812059305	0.9177402142013188	0.7210809162632923	104948
100	9655124	104947	0.8553460318065309	0.921960608688195	0.7363431065204341	104947
100	9655124	104948	0.8392823112398521	0.9182261691504364	0.7218908411784883	104948
100	9760071	104947	0.8553650890449465	0.9227133696056105	0.7375913556366547	104947
100	9760071	104948	0.8395967526775164	0.9186835385143118	0.7216049853260662	104948
100	9865018	104947	0.8553174459489076	0.9223893965525456	0.7352187294539149	104947
100	9865018	104948	0.839406182109235	0.9184929679460304	0.7202519342912681	104948
100	9969965	104947	0.8552888600912841	0.9224656255062079	0.7362287630899407	104947
100	9969965	104948	0.8397015664900713	0.9188741090825933	0.7211857300758471	104948
100	10074912	104947	0.8551745166607907	0.922055894880273	0.7360191334673692	104947
100	10074912	104948	0.8401494073255327	0.9186358958722415	0.7204425048595495	104948
100	10179859	104947	0.8555937759059334	0.9223798679333378	0.7359047900368758	104947
100	10179859	104948	0.8395014673933757	0.9186168388154133	0.7220147120478714	104948
100	10284806	104947	0.8558510486245439	0.9223322248372988	0.736028662086577	104947
100	10284806	104948	0.839920722643595	0.9186454244006556	0.7218717841216602	104948
100	10389753	104947	0.8556128331443491	0.9225704403174936	0.7357713893679667	104947
100	10389753	104948	0.8397492091321417	0.9184072111903038	0.7209856309791516	104948
100	10494700	104947	0.8556033045251412	0.9226561978903637	0.7366956654311224	104947
100	10494700	104948	0.840063650569806	0.9187502382132103	0.7216716850249647	104948
100	10599647	104947	0.8556985907172192	0.9222655245028443	0.7360572479442004	104947
100	10599647	104948	0.8398826085299387	0.9189217517246636	0.7213858291725426	104948
100	10704594	104947	0.8554032035217777	0.9225323258406625	0.7367909516232003	104947
100	10704594	104948	0.8393394824103365	0.9186263673438274	0.72130960094523	104948
100	10809541	104947	0.8556795334788035	0.9223131675988833	0.7369910526265638	104947
100	10809541	104948	0.8395967526775164	0.9188074093836948	0.7213000724168159	104948
100	10914488	104947	0.8558986917205827	0.9223893965525456	0.7355808169838108	104947
100	10914488	104948	0.8400827076266342	0.9190837367077028	0.7212047871326752	104948
100	11019435	104947	0.8555461328098946	0.9222178814068054	0.7371149246762652	104947
100	11019435	104948	0.8395967526775164	0.9183881541334756	0.7218241414795898	104948
100	11124382	104947	0.8555366041906868	0.9220273090226495	0.7369624667689405	104947
100	11124382	104948	0.8398444944162824	0.9185501391165148	0.7220242405762854	104948
100	11229329	104947	0.8561273785815697	0.9226561978903637	0.7360667765634082	104947
100	11229329	104948	0.8399588367572512	0.9194172352021954	0.7221099973320121	104948
100	11334276	104947	0.8559082203397905	0.92237033931413	0.7374674835869535	104947
100	11334276	104948	0.8396062812059305	0.9189503373099058	0.7211571444906049	104948
100	11439223	104947	0.8553936749025699	0.9224179824101689	0.7366766081927067	104947
100	11439223	104948	0.8401303502687045	0.9187597667416244	0.7225197240538171	104948
100	11544170	104947	0.8560892641047386	0.9222369386452209	0.7376580559711092	104947
100	11544170	104948	0.8402542211380875	0.9190456225940465	0.7224149102412624	104948
100	11649117	104947	0.8563084223465177	0.9216080497775068	0.7365432075237978	104947
100	11649117	104948	0.8395014673933757	0.9185310820596867	0.7217479132522773	104948
100	11754064	104947	0.8558034055285049	0.9221892955491819	0.736714722669538	104947
100	11754064	104948	0.8395491100354461	0.918712124099554	0.7222529252582232	104948
100	11859011	104947	0.855431789379401	0.922551383079078	0.7372483253451743	104947
100	11859011	104948	0.8396539238480009	0.9189217517246636	0.7218146129511758	104948
100	11963958	104947	0.8564894661114658	0.9227991271784806	0.7375532411598236	104947
100	11963958	104948	0.840320920836986	0.9193600640317109	0.7226531234516141	104948
100	12068905	104947	0.8563274795849334	0.9223036389796755	0.7370482243418106	104947
100	12068905	104948	0.840197049967603	0.9188074093836948	0.7220814117467699	104948
100	12173852	104947	0.856479937492258	0.9220177804034417	0.736762365765577	104947
100	12173852	104948	0.8400064793993216	0.9185120250028586	0.7222243396729809	104948
100	12278799	104947	0.8562798364888944	0.9223512820757144	0.7373150256796288	104947
100	12278799	104948	0.8399493082288372	0.9192076075770858	0.7224339672980905	104948
100	12383746	104947	0.85638465130018	0.9225799689367014	0.7371720963915119	104947
100	12383746	104948	0.8400064793993216	0.918969394366734	0.7219289552921447	104948
100	12488693	104947	0.8564894661114658	0.921503234966221	0.7376389987326937	104947
100	12488693	104948	0.8402637496665015	0.9183119259061631	0.7225197240538171	104948
100	12593640	104947	0.8564037085385957	0.9223322248372988	0.7379629717857585	104947
100	12593640	104948	0.8406734763883066	0.9189979799519762	0.7225101955254031	104948
100	12698587	104947	0.8558986917205827	0.9221892955491819	0.7373912546332911	104947
100	12698587	104948	0.8399874223424935	0.9187502382132103	0.7227579372641689	104948
100	12803534	104947	0.8566800384956216	0.9224942113638313	0.7385251603190182	104947
100	12803534	104948	0.8409688607691428	0.9191694934634295	0.7231200213439036	104948
100	12908481	104947	0.8571278835983878	0.9219320228305716	0.7379915576433819	104947
100	12908481	104948	0.8403685634790563	0.9188550520257651	0.7227769943209971	104948
100	13013428	104947	0.8561750216776087	0.9220654234994807	0.7385823320342649	104947
100	13013428	104948	0.8404257346495407	0.918978922895148	0.7231581354575599	104948
100	13118375	104947	0.8561559644391931	0.9224560968870001	0.73817260140833	104947
100	13118375	104948	0.840730647558791	0.9188074093836948	0.7227198231505126	104948
100	13223322	104947	0.8572708128865046	0.9223989251717534	0.7378581569744729	104947
100	13223322	104948	0.8402446926096734	0.9189312802530777	0.7237203186339901	104948
100	13328269	104947	0.8565085233498814	0.9221988241683897	0.7379343859281352	104947
100	13328269	104948	0.8402732781949156	0.9187311811563822	0.7233201204405991	104948
100	13433216	104947	0.8567276815916606	0.9222083527875975	0.7381154296930832	104947
100	13433216	104948	0.8407401760872051	0.9189884514235621	0.723958531844342	104948
100	13538163	104947	0.8567372102108683	0.9220368376418573	0.738124958312291	104947
100	13538163	104948	0.8404924343484392	0.9186263673438274	0.7236536189350916	104948
100	13643110	104947	0.8562703078696866	0.9224275110293767	0.7384298741269403	104947
100	13643110	104948	0.8409307466554865	0.9189312802530777	0.7243968441513893	104948
100	13748057	104947	0.8561559644391931	0.9225323258406625	0.737715227686356	104947
100	13748057	104948	0.8397682661889698	0.9190075084803903	0.7234154057247398	104948
100	13853004	104947	0.8563370082041412	0.9225704403174936	0.7384679886037714	104947
100	13853004	104948	0.8401779929107749	0.9187407096847963	0.7244635438502878	104948
100	13957951	104947	0.8563655940617645	0.9225227972214547	0.737257853964382	104947
100	13957951	104948	0.8401684643823608	0.9191409078781873	0.7230723787018333	104948
100	14062898	104947	0.8564608802538424	0.922275053122052	0.738620446511096	104947
100	14062898	104948	0.8405877196325799	0.918978922895148	0.7244254297366315	104948
100	14167845	104947	0.8570421260255177	0.9218939083537404	0.7382774162196156	104947
100	14167845	104948	0.8412356595647368	0.9186835385143118	0.7231771925143881	104948
100	14272792	104947	0.8572326984096734	0.9224942113638313	0.738668089607135	104947
100	14272792	104948	0.840730647558791	0.9189884514235621	0.7237774898044746	104948
100	14377739	104947	0.8567467388300761	0.9226752551287792	0.7379248573089273	104947
100	14377739	104948	0.840320920836986	0.9191313793497732	0.7236822045203338	104948
100	14482686	104947	0.8570135401678942	0.9223036389796755	0.7384584599845636	104947
100	14482686	104948	0.8408164043145177	0.9186740099858978	0.7240538171284827	104948
100	14587633	104947	0.8567753246876995	0.9221607096915586	0.738715732703174	104947
100	14587633	104948	0.840597248160994	0.9185977817585852	0.7238156039181309	104948
100	14692580	104947	0.8565466378267125	0.9222941103604677	0.738401288269317	104947
100	14692580	104948	0.8402256355528452	0.9185310820596867	0.7240347600716546	104948
100	14797527	104947	0.8569754256910631	0.92237033931413	0.7381535441699143	104947
100	14797527	104948	0.840463848763197	0.9190075084803903	0.7233201204405991	104948
100	14902474	104947	0.8567753246876995	0.9223512820757144	0.7396018942894985	104947
100	14902474	104948	0.8407878187292754	0.9189026946678355	0.7245397720776003	104948
100	15007421	104947	0.8567753246876995	0.9225227972214547	0.7387443185607974	104947
100	15007421	104948	0.8409116895986584	0.9189408087814918	0.7236440904066775	104948
100	15112368	104947	0.8571469408368033	0.9222083527875975	0.738896776468122	104947
100	15112368	104948	0.8408735754850021	0.9186549529290696	0.7244349582650456	104948
100	15217315	104947	0.8563941799193879	0.9222464672644287	0.738401288269317	104947
100	15217315	104948	0.8406544193314784	0.9188645805541792	0.7241681594694516	104948
100	15322262	104947	0.857347041840167	0.9223608106949222	0.7397257663391998	104947
100	15322262	104948	0.8412833022068071	0.9191790219918435	0.7243301444524908	104948
100	15427209	104947	0.8571374122175955	0.9225418544598702	0.7384298741269403	104947
100	15427209	104948	0.8413785874909478	0.918835994968937	0.7231390784007318	104948
100	15532156	104947	0.8568134391645307	0.9223512820757144	0.7390492343754467	104947
100	15532156	104948	0.8404829058200252	0.918969394366734	0.7245302435491863	104948
100	15637103	104947	0.8571088263599722	0.9229896995626363	0.7389348909449531	104947
100	15637103	104948	0.8409307466554865	0.9195601631284065	0.7244540153218737	104948
100	15742050	104947	0.8569944829294787	0.922008251784234	0.7390206485178232	104947
100	15742050	104948	0.8407497046156192	0.9189217517246636	0.7253401684643823	104948
100	15846997	104947	0.8576138431779851	0.9229706423242208	0.7394684936205894	104947
100	15846997	104948	0.8406353622746503	0.9190170370088043	0.7242634447535923	104948
100	15951944	104947	0.857347041840167	0.9217605076848314	0.7392207495211869	104947
100	15951944	104948	0.841130845752182	0.9187978808552807	0.7241205168273812	104948
100	16056891	104947	0.8568324964029462	0.9225990261751169	0.740030682153849	104947
100	16056891	104948	0.8405781911041659	0.9188931661394214	0.7250447840835461	104948
100	16161838	104947	0.857299398744128	0.9216652214927535	0.7392588639980181	104947
100	16161838	104948	0.8409879178259709	0.9183405114914053	0.7249113846857491	104948
100	16266785	104947	0.8573946849362059	0.9224656255062079	0.7393827360477193	104947
100	16266785	104948	0.840997446354385	0.9191885505202576	0.725168654952929	104948
100	16371732	104947	0.8566323953995827	0.9225132686022468	0.739716237719992	104947
100	16371732	104948	0.8406734763883066	0.9189503373099058	0.7245016579639441	104948
100	16476679	104947	0.8563655940617645	0.9229039419897662	0.7390682916138622	104947
100	16476679	104948	0.8405210199336814	0.9189979799519762	0.7244730723787018	104948
100	16581626	104947	0.8575471428435305	0.9223893965525456	0.7409549582170047	104947
100	16581626	104948	0.8417216145138545	0.919245721690742	0.7253401684643823	104948
100	16686573	104947	0.8572517556480891	0.9226561978903637	0.7408977865017581	104947
100	16686573	104948	0.8404447917063689	0.9191885505202576	0.7250161984983039	104948
100	16791520	104947	0.8569182539758163	0.9222655245028443	0.7402879548724595	104947
100	16791520	104948	0.8409116895986584	0.9192266646339139	0.7244444867934596	104948
100	16896467	104947	0.8574899711282838	0.9224179824101689	0.7395066080974206	104947
100	16896467	104948	0.841264245149979	0.919626862827305	0.7249590273278195	104948
100	17001414	104947	0.8573851563169981	0.9223322248372988	0.7399068101041478	104947
100	17001414	104948	0.8412166025079086	0.9190646796508747	0.725292525822312	104948
100	17106361	104947	0.857480442509076	0.9226561978903637	0.7398782242465244	104947
100	17106361	104948	0.8412070739794946	0.9194172352021954	0.7248542135152647	104948
100	17211308	104947	0.8573565704593747	0.9226466692711559	0.740993072693836	104947
100	17211308	104948	0.8403876205358844	0.9191504364066013	0.725292525822312	104948
100	17316255	104947	0.8570992977407644	0.9218176794000781	0.7400592680114725	104947
100	17316255	104948	0.8411975454510805	0.9189884514235621	0.7250924267256165	104948
100	17421202	104947	0.8567276815916606	0.9232660295196623	0.7394208505245505	104947
100	17421202	104948	0.8409307466554865	0.9194553493158517	0.7249304417425773	104948
100	17526149	104947	0.8573279846017514	0.9223989251717534	0.7405642848294853	104947
100	17526149	104948	0.8413595304341197	0.9190932652361169	0.7248542135152647	104948
100	17631096	104947	0.8573851563169981	0.9227991271784806	0.7413646888429398	104947
100	17631096	104948	0.8406925334451347	0.9192933643328124	0.7262358501353051	104948
100	17736043	104947	0.8572708128865046	0.922961113705013	0.7404594700181997	104947
100	17736043	104948	0.8411975454510805	0.9192171361054998	0.7253973396348667	104948
100	17840990	104947	0.8574994997474916	0.9224275110293767	0.7406595710215632	104947
100	17840990	104948	0.8411403742805961	0.9192647787475702	0.7260357510386096	104948
100	17945937	104947	0.8574613852706604	0.9229992281818442	0.7404118269221607	104947
100	17945937	104948	0.8412261310363227	0.9191218508213591	0.7256355528452186	104948
100	18050884	104947	0.8568324964029462	0.9227895985592728	0.7413456316045242	104947
100	18050884	104948	0.8408164043145177	0.9196554484125472	0.7255116819758356	104948
100	18155831	104947	0.8575376142243227	0.9231135716123376	0.7413170457469008	104947
100	18155831	104948	0.8405781911041659	0.9192647787475702	0.7257975378282578	104948
100	18260778	104947	0.856708624353245	0.9227705413208572	0.7402498403956282	104947
100	18260778	104948	0.8408068757861036	0.9194458207874376	0.7251877120097572	104948
100	18365725	104947	0.8564037085385957	0.9229420564665974	0.7412027023164073	104947
100	18365725	104948	0.841130845752182	0.9191790219918435	0.7266836909707665	104948
100	18470672	104947	0.8570611832639332	0.9228848847513507	0.7409263723593814	104947
100	18470672	104948	0.8411689598658383	0.9193410069748827	0.7259881083965393	104948
100	18575619	104947	0.8567372102108683	0.9224846827446235	0.7405547562102776	104947
100	18575619	104948	0.8402256355528452	0.918835994968937	0.7259214086976408	104948
100	18680566	104947	0.857070711883141	0.9226847837479871	0.740488055875823	104947
100	18680566	104948	0.8412928307352212	0.9187978808552807	0.7257498951861875	104948
100	18785513	104947	0.8562321933928554	0.921731921827208	0.7403165407300828	104947
100	18785513	104948	0.8403399778938141	0.9184262682471319	0.7248065708731943	104948
100	18890460	104947	0.8563560654425567	0.9227228982248182	0.7405261703526542	104947
100	18890460	104948	0.8406353622746503	0.9182166406220224	0.7257498951861875	104948
100	18995407	104947	0.856432294396219	0.9222178814068054	0.7413361029853164	104947
100	18995407	104948	0.8416644433433701	0.9189979799519762	0.7258832945839845	104948
100	19100354	104947	0.8566323953995827	0.9224942113638313	0.7409073151209659	104947
100	19100354	104948	0.8412737736783931	0.918978922895148	0.7261024507375081	104948
100	19205301	104947	0.8573279846017514	0.9222845817412598	0.740535698971862	104947
100	19205301	104948	0.8406163052178222	0.9186263673438274	0.7259214086976408	104948
100	19310248	104947	0.8566323953995827	0.9220368376418573	0.741717247753628	104947
100	19310248	104948	0.8398921370583528	0.9186930670427259	0.7262549071921333	104948
100	19415195	104947	0.8557271765748425	0.9223798679333378	0.7409263723593814	104947
100	19415195	104948	0.8399969508709075	0.9186549529290696	0.7261024507375081	104948
100	19520142	104947	0.8565466378267125	0.9226561978903637	0.7406976854983944	104947
100	19520142	104948	0.8409688607691428	0.9187216526279681	0.7253211114075542	104948
100	19625089	104947	0.8566323953995827	0.9226561978903637	0.7404118269221607	104947
100	19625089	104948	0.840864046956588	0.9187597667416244	0.7261405648511644	104948
100	19730036	104947	0.8556795334788035	0.9224751541254157	0.7407548572136412	104947
100	19730036	104948	0.8407782902008614	0.9187788237984525	0.726226321606891	104948
100	19834983	104947	0.8568324964029462	0.9223512820757144	0.7401545542035504	104947
100	19834983	104948	0.8412547166215649	0.918978922895148	0.7254830963905934	104948
100	19939930	104947	0.856842025022154	0.9227514840824416	0.7409359009785892	104947
100	19939930	104948	0.8414834013035026	0.918969394366734	0.7267027480275946	104948
100	20044877	104947	0.8569849543102709	0.9225323258406625	0.7418220625649137	104947
100	20044877	104948	0.8410546175248694	0.9184072111903038	0.7263311354194458	104948
100	20149824	104947	0.8559177489589983	0.9225609116982858	0.7420888639027319	104947
100	20149824	104948	0.8408068757861036	0.9185501391165148	0.7264645348172428	104948
100	20254771	104947	0.8570897691215565	0.9229515850858052	0.7409168437401736	104947
100	20254771	104948	0.8405686625757518	0.9188931661394214	0.7261500933795785	104948
100	20359718	104947	0.8560225637702841	0.9219510800689872	0.7413265743661086	104947
100	20359718	104948	0.8414357586614323	0.9180165415253269	0.7267694477264931	104948
100	20464665	104947	0.8562607792504788	0.9225609116982858	0.7409644868362125	104947
100	20464665	104948	0.8410927316385257	0.9185120250028586	0.7261500933795785	104948
100	20569612	104947	0.8566514526379982	0.9221702383107664	0.7415647898463034	104947
100	20569612	104948	0.8406734763883066	0.918168997979952	0.7263120783626177	104948
100	20674559	104947	0.8569277825950241	0.9224179824101689	0.7414599750350177	104947
100	20674559	104948	0.8409783892975569	0.9180641841673972	0.7273316309029233	104948
100	20779506	104947	0.8566419240187905	0.9227038409864027	0.7430607830619265	104947
100	20779506	104948	0.8407973472576895	0.918578724701757	0.7268647330106338	104948
100	20884453	104947	0.85638465130018	0.9223512820757144	0.7423937797173812	104947
100	20884453	104948	0.840864046956588	0.9183405114914053	0.7265884056866257	104948
100	20989400	104947	0.8570230687871021	0.9225894975559091	0.7420316921874851	104947
100	20989400	104948	0.8409307466554865	0.9183595685482334	0.7261119792659222	104948
100	21094347	104947	0.856613338161167	0.9224465682677924	0.7417077191344202	104947
100	21094347	104948	0.8407211190303769	0.9184739108892023	0.7257975378282578	104948
100	21199294	104947	0.8569277825950241	0.9226085547943248	0.7415743184655111	104947
100	21199294	104948	0.8416263292297138	0.9186358958722415	0.7266932194991805	104948
100	21304241	104947	0.8579378162310499	0.9214079487741431	0.7412408167932385	104947
100	21304241	104948	0.8418359568548234	0.9176068148035218	0.7267789762549072	104948
100	21409188	104947	0.8571088263599722	0.9229420564665974	0.740945429597797	104947
100	21409188	104948	0.8412451880931509	0.9182261691504364	0.7262739642489614	104948
100	21514135	104947	0.8570992977407644	0.9221797669299742	0.741221759554823	104947
100	21514135	104948	0.840997446354385	0.9187216526279681	0.7257594237146016	104948
100	21619082	104947	0.8573851563169981	0.9228181844168961	0.7415457326078878	104947
100	21619082	104948	0.8411689598658383	0.9187978808552807	0.7263883065899303	104948
100	21724029	104947	0.856708624353245	0.9218081507808703	0.7422032073332253	104947
100	21724029	104948	0.8410927316385257	0.9179117277127721	0.7271220032778137	104948
100	21828976	104947	0.8570516546447254	0.9226371406519481	0.7412027023164073	104947
100	21828976	104948	0.8410641460532835	0.9179117277127721	0.7259214086976408	104948
100	21933923	104947	0.8569087253566086	0.9227705413208572	0.7408596720249269	104947
100	21933923	104948	0.8407115905019629	0.9182833403209208	0.7253878111064527	104948
100	22038870	104947	0.8566705098764138	0.9218748511153249	0.7418697056609527	104947
100	22038870	104948	0.8406448908030644	0.9181594694515379	0.7269600182947745	104948
100	22143817	104947	0.8571945839328423	0.9221988241683897	0.7416219615615501	104947
100	22143817	104948	0.8410069748827991	0.9181880550367801	0.7261405648511644	104948
100	22248764	104947	0.8565752236843359	0.9227419554632338	0.7416410187999657	104947
100	22248764	104948	0.8417216145138545	0.9184548538323741	0.7269600182947745	104948
100	22353711	104947	0.8575376142243227	0.9218176794000781	0.7409644868362125	104947
100	22353711	104948	0.8413881160193619	0.9178259709570454	0.726226321606891	104948
100	22458658	104947	0.8579378162310499	0.9224465682677924	0.7420888639027319	104947
100	22458658	104948	0.8414167016046041	0.918569196173343	0.7261405648511644	104948
100	22563605	104947	0.8566800384956216	0.9217414504464158	0.7418030053264981	104947
100	22563605	104948	0.8401017646834623	0.9181308838662957	0.726216793078477	104948
100	22668552	104947	0.8569182539758163	0.9220749521186885	0.7420031063298618	104947
100	22668552	104948	0.8403495064222282	0.9181785265083661	0.7263406639478599	104948
100	22773499	104947	0.8571469408368033	0.9227895985592728	0.7423937797173812	104947
100	22773499	104948	0.8409116895986584	0.918712124099554	0.7269028471242901	104948
100	22878446	104947	0.8570802405023488	0.9225894975559091	0.7423270793829266	104947
100	22878446	104948	0.8409593322407287	0.9186073102869993	0.7261405648511644	104948
100	22983393	104947	0.856842025022154	0.9220463662610651	0.7425271803862902	104947
100	22983393	104948	0.8410165034112131	0.9180927697526394	0.7261500933795785	104948
100	23088340	104947	0.8568515536413618	0.92237033931413	0.7422984935253032	104947
100	23088340	104948	0.8408831040134161	0.9182356976788505	0.7265121774593132	104948
100	23193287	104947	0.8567372102108683	0.9227991271784806	0.7416505474191735	104947
100	23193287	104948	0.8409212181270724	0.9184643823607882	0.7270076609368449	104948
100	23298234	104947	0.8569468398334398	0.9221511810723508	0.7421460356179786	104947
100	23298234	104948	0.8406734763883066	0.9180165415253269	0.726883790067462	104948
100	23403181	104947	0.8575376142243227	0.9223036389796755	0.7408501434057191	104947
100	23403181	104948	0.8414262301330182	0.9182071120936083	0.7267885047833212	104948
100	23508128	104947	0.8567467388300761	0.9225227972214547	0.74235566524055	104947
100	23508128	104948	0.8393680679955787	0.9182452262072646	0.7265598201013835	104948
100	23613075	104947	0.8567276815916606	0.9223512820757144	0.7421841500948098	104947
100	23613075	104948	0.8407592331440332	0.9178164424286314	0.7266455768571102	104948
100	23718022	104947	0.8572041125520501	0.9224656255062079	0.7420221635682773	104947
100	23718022	104948	0.8412833022068071	0.9182261691504364	0.7271315318062278	104948
100	23822969	104947	0.8575852573203617	0.921731921827208	0.7424318941942123	104947
100	23822969	104948	0.8413118877920494	0.9178926706559439	0.7273697450165797	104948
100	23927916	104947	0.8568610822605697	0.9224942113638313	0.7427463386280694	104947
100	23927916	104948	0.8408068757861036	0.918578724701757	0.7270076609368449	104948
100	24032863	104947	0.8571088263599722	0.922503739983039	0.7424700086710435	104947
100	24032863	104948	0.8404733772916111	0.9184548538323741	0.7270553035789152	104948
100	24137810	104947	0.8565180519690891	0.9219701373074027	0.7428606820585629	104947
100	24137810	104948	0.8402065784960171	0.9182738117925068	0.7271124747493997	104948
100	24242757	104947	0.8561940789160243	0.9225799689367014	0.741717247753628	104947
100	24242757	104948	0.8400160079277357	0.9179212562411861	0.7264264207035865	104948
100	24347704	104947	0.8563370082041412	0.9217509790656236	0.741488560892641	104947
100	24347704	104948	0.8398159088310402	0.9175972862751077	0.7274745588291345	104948
100	24452651	104947	0.857299398744128	0.921960608688195	0.7423175507637189	104947
100	24452651	104948	0.8410546175248694	0.9179593703548424	0.7268075618401494	104948
100	24557598	104947	0.8569182539758163	0.9228372416553118	0.7418601770417449	104947
100	24557598	104948	0.8408735754850021	0.9188550520257651	0.7269409612379464	104948
100	24662545	104947	0.8570135401678942	0.9223798679333378	0.7415933757039267	104947
100	24662545	104948	0.8403018637801578	0.9181404123947098	0.726626519800282	104948
100	24767492	104947	0.8568134391645307	0.9223989251717534	0.7424414228134201	104947
100	24767492	104948	0.840054122041392	0.9178354994854595	0.7266836909707665	104948
100	24872439	104947	0.8578711158965954	0.9222274100260132	0.7404213555413685	104947
100	24872439	104948	0.8410355604680413	0.9181404123947098	0.7259785798681252	104948
100	24977386	104947	0.8572041125520501	0.9216461642543379	0.7421555642371864	104947
100	24977386	104948	0.8405305484620955	0.9179879559400846	0.7272172885619544	104948
100	25082333	104947	0.8567753246876995	0.9224656255062079	0.7433466416381602	104947
100	25082333	104948	0.8407497046156192	0.9181880550367801	0.7272268170903685	104948
100	25187280	104947	0.8565371092075047	0.9224179824101689	0.7422699076676799	104947
100	25187280	104948	0.8402637496665015	0.9183881541334756	0.7266741624423524	104948
100	25292227	104947	0.8570992977407644	0.9224465682677924	0.7408215575480958	104947
100	25292227	104948	0.8412833022068071	0.9179974844684987	0.7264645348172428	104948
100	25397174	104947	0.857661486274024	0.9212936053436497	0.7422127359524331	104947
100	25397174	104948	0.8406830049167207	0.917244730723787	0.7273221023745092	104948
100	25502121	104947	0.8563179509657256	0.9220654234994807	0.7417363049920436	104947
100	25502121	104948	0.8413214163204634	0.9180641841673972	0.7270934176925715	104948
100	25607068	104947	0.8562703078696866	0.9221702383107664	0.7425176517670824	104947
100	25607068	104948	0.8402351640812593	0.918168997979952	0.72767465792583	104948
100	25712015	104947	0.8570516546447254	0.9227133696056105	0.7424795372902513	104947
100	25712015	104948	0.8410832031101116	0.9179784274116706	0.7269981324084308	104948
100	25816962	104947	0.8559939779126606	0.9227038409864027	0.7418601770417449	104947
100	25816962	104948	0.8401017646834623	0.9186454244006556	0.7273602164881656	104948
100	25921909	104947	0.856937311214232	0.922055894880273	0.7416886618960047	104947
100	25921909	104948	0.8419502991957922	0.9185215535312726	0.7266646339139383	104948
100	26026856	104947	0.8562607792504788	0.9220844807378963	0.7432608840652901	104947
100	26026856	104948	0.8406544193314784	0.9179212562411861	0.7271410603346419	104948
100	26131803	104947	0.8572517556480891	0.922275053122052	0.7423175507637189	104947
100	26131803	104948	0.8405686625757518	0.9179593703548424	0.726883790067462	104948
100	26236750	104947	0.8571469408368033	0.9227419554632338	0.7436515574528095	104947
100	26236750	104948	0.8408545184281739	0.918435796775546	0.7279128711361817	104948
100	26341697	104947	0.8573565704593747	0.9223417534565066	0.7424223655750045	104947
100	26341697	104948	0.8413023592636353	0.9183690970766475	0.7277890002667988	104948
100	26446644	104947	0.8580140451847122	0.9227038409864027	0.741764890849667	104947
100	26446644	104948	0.8413309448488775	0.9186073102869993	0.7272077600335404	104948
100	26551591	104947	0.8582236748072837	0.921731921827208	0.7423461366213422	104947
100	26551591	104948	0.841264245149979	0.9182452262072646	0.7266074627434539	104948
100	26656538	104947	0.8579854593270889	0.9226085547943248	0.7442995035589393	104947
100	26656538	104948	0.8414452871898463	0.9188931661394214	0.7267980333117353	104948
100	26761485	104947	0.8564513516346346	0.9224084537909611	0.7424985945286668	104947
100	26761485	104948	0.8402351640812593	0.9185215535312726	0.7265217059877273	104948
100	26866432	104947	0.8566228667803748	0.9224084537909611	0.7424128369557967	104947
100	26866432	104948	0.8402542211380875	0.9183500400198193	0.7281034417044632	104948
100	26971379	104947	0.8571659980752189	0.9219129655921561	0.7436896719296406	104947
100	26971379	104948	0.8410546175248694	0.9183119259061631	0.7270457750505012	104948
100	27076326	104947	0.8571374122175955	0.9223512820757144	0.7427463386280694	104947
100	27076326	104948	0.8403113923085719	0.9183119259061631	0.7269314327095323	104948
100	27181273	104947	0.857842530038972	0.9221130665955196	0.7429750254890564	104947
100	27181273	104948	0.8403876205358844	0.9180451271105691	0.7263787780615162	104948
100	27286220	104947	0.8564608802538424	0.9221511810723508	0.7423937797173812	104947
100	27286220	104948	0.8404924343484392	0.9181880550367801	0.7277127720394863	104948
100	27391167	104947	0.8579282876118421	0.9221892955491819	0.7436134429759783	104947
100	27391167	104948	0.8410832031101116	0.9189122231962495	0.7263311354194458	104948
100	27496114	104947	0.8573946849362059	0.9224465682677924	0.7430988975387577	104947
100	27496114	104948	0.8405781911041659	0.918969394366734	0.7273983306018219	104948
100	27601061	104947	0.8572136411712579	0.9224751541254157	0.7428511534393551	104947
100	27601061	104948	0.8400160079277357	0.9185310820596867	0.7275889011701033	104948
100	27706008	104947	0.8567467388300761	0.9222941103604677	0.7426224665783682	104947
100	27706008	104948	0.839672980904829	0.9184834394176163	0.7269600182947745	104948
100	27810955	104947	0.8577281866084786	0.9212650194860262	0.7434228705918225	104947
100	27810955	104948	0.8406067766894081	0.9176830430308344	0.726750390669665	104948
100	27915902	104947	0.8574137421746215	0.9218653224961171	0.7424890659094591	104947
100	27915902	104948	0.841511986888745	0.9177973853718032	0.7264740633456569	104948
100	28020849	104947	0.8574423280322448	0.9215794639198833	0.7426034093399525	104947
100	28020849	104948	0.8402923352517437	0.9173590730647558	0.7268456759538057	104948
100	28125796	104947	0.8578711158965954	0.9217986221616625	0.7433180557805369	104947
100	28125796	104948	0.8412070739794946	0.9183786256050616	0.7278842855509395	104948
100	28230743	104947	0.85802357380392	0.9221511810723508	0.7428511534393551	104947
100	28230743	104948	0.8411594313374242	0.9183214544345771	0.7265407630445554	104948
100	28335690	104947	0.858118859995998	0.9219891945458184	0.7424414228134201	104947
100	28335690	104948	0.8412928307352212	0.9179403132980143	0.7271791744482982	104948
100	28440637	104947	0.8578997017542188	0.9225799689367014	0.7430798403003421	104947
100	28440637	104948	0.8414167016046041	0.9187407096847963	0.7273221023745092	104948
100	28545584	104947	0.856479937492258	0.921598521158299	0.7436515574528095	104947
100	28545584	104948	0.8410927316385257	0.917635400388764	0.7278461714372833	104948
100	28650531	104947	0.8576233717971928	0.9222941103604677	0.7431465406347966	104947
100	28650531	104948	0.8422361550482144	0.9190170370088043	0.7277508861531425	104948
100	28755478	104947	0.8578330014197643	0.921827208019286	0.7428416248201473	104947
100	28755478	104948	0.8409688607691428	0.917511529519381	0.7274555017723063	104948
100	28860425	104947	0.8577377152276864	0.9217700363040392	0.7441851601284458	104947
100	28860425	104948	0.8412547166215649	0.9177211571444907	0.7282654266875024	104948
100	28965372	104947	0.8583380182377771	0.9228086557976883	0.7438326012177575	104947
100	28965372	104948	0.841511986888745	0.9185024964744445	0.7281415558181195	104948
100	29070319	104947	0.8572517556480891	0.9219510800689872	0.7443185607973548	104947
100	29070319	104948	0.8404924343484392	0.9182642832640927	0.728484582841026	104948
100	29175266	104947	0.8577281866084786	0.9221511810723508	0.7429273823930175	104947
100	29175266	104948	0.8413214163204634	0.9178545565422876	0.7271220032778137	104948
100	29280213	104947	0.8579378162310499	0.9226561978903637	0.7423937797173812	104947
100	29280213	104948	0.8410641460532835	0.9187407096847963	0.7279795708350802	104948
100	29385160	104947	0.857842530038972	0.9228277130361039	0.744175631509238	104947
100	29385160	104948	0.8421218127072455	0.9189884514235621	0.727407859130236	104948
100	29490107	104947	0.8571564694560111	0.9226085547943248	0.7432799413037057	104947
100	29490107	104948	0.8407115905019629	0.9184262682471319	0.7274936158859626	104948
100	29595054	104947	0.8582236748072837	0.9223512820757144	0.7432037123500433	104947
100	29595054	104948	0.8412166025079086	0.9184834394176163	0.72716011739147	104948
100	29700001	104947	0.8581569744728291	0.9218653224961171	0.7421174497603552	104947
100	29700001	104948	0.8408354613713458	0.9186263673438274	0.7264073636467584	104948
100	29804948	104947	0.8587001057676732	0.9229134706089741	0.7438516584561731	104947
100	29804948	104948	0.8413404733772916	0.9182928688493349	0.7280557990623928	104948
100	29909895	104947	0.8565371092075047	0.9226657265095715	0.7428035103433162	104947
100	29909895	104948	0.8407878187292754	0.9189122231962495	0.727026717993673	104948
100	30014842	104947	0.8581760317112447	0.9222369386452209	0.7435943857375628	104947
100	30014842	104948	0.8407401760872051	0.9187311811563822	0.728084384647635	104948
100	30119789	104947	0.8574232707938293	0.922551383079078	0.7426605810551993	104947
100	30119789	104948	0.8409116895986584	0.9183881541334756	0.7274459732438923	104948
100	30224736	104947	0.8573089273633357	0.9222941103604677	0.7429654968698486	104947
100	30224736	104948	0.8412928307352212	0.9181594694515379	0.7272363456187826	104948
100	30329683	104947	0.8577853583237253	0.9219796659266106	0.7421365069987708	104947
100	30329683	104948	0.8402923352517437	0.9184453253039601	0.7270457750505012	104948
100	30434630	104947	0.8572041125520501	0.9221035379763118	0.7435753284991472	104947
100	30434630	104948	0.8404066775927126	0.9185977817585852	0.728208255517018	104948
100	30539577	104947	0.8581950889496603	0.9226276120327404	0.7432227695884589	104947
100	30539577	104948	0.8410165034112131	0.9182928688493349	0.7273506879597514	104948
100	30644524	104947	0.8578520586581798	0.922961113705013	0.7426129379591604	104947
100	30644524	104948	0.8408259328429317	0.9185977817585852	0.7266932194991805	104948
100	30749471	104947	0.8584714189066862	0.9225704403174936	0.7430036113466798	104947
100	30749471	104948	0.8403780920074704	0.9188074093836948	0.7273602164881656	104948
100	30854418	104947	0.8583856613338161	0.9226085547943248	0.743632500214394	104947
100	30854418	104948	0.8408449898997599	0.918712124099554	0.7279033426077677	104948
100	30959365	104947	0.8584237758106473	0.9219415514497794	0.7437849581217186	104947
100	30959365	104948	0.8411594313374242	0.9185120250028586	0.726483591874071	104948
100	31064312	104947	0.857299398744128	0.9227895985592728	0.7423937797173812	104947
100	31064312	104948	0.8407401760872051	0.918035598582155	0.7263216068910318	104948
100	31169259	104947	0.8576424290356085	0.9220368376418573	0.7446234766120041	104947
100	31169259	104948	0.8408164043145177	0.9179212562411861	0.7273030453176811	104948
100	31274206	104947	0.858528590621933	0.9225227972214547	0.74408034531716	104947
100	31274206	104948	0.8402256355528452	0.9190551511224606	0.7277127720394863	104948
100	31379153	104947	0.8581569744728291	0.9225323258406625	0.7428892679161863	104947
100	31379153	104948	0.8401589358539467	0.918978922895148	0.7276937149826581	104948
100	31484100	104947	0.8574137421746215	0.9225227972214547	0.7430417258235109	104947
100	31484100	104948	0.8408735754850021	0.9185501391165148	0.7279605137782521	104948
100	31589047	104947	0.8583666040954006	0.9217605076848314	0.7438326012177575	104947
100	31589047	104948	0.8409212181270724	0.91870259557114	0.728074856119221	104948
100	31693994	104947	0.8576710148932318	0.9227419554632338	0.7433656988765758	104947
100	31693994	104948	0.8410927316385257	0.9190646796508747	0.7271124747493997	104948
100	31798941	104947	0.8580807455191668	0.9222369386452209	0.7438802443137965	104947
100	31798941	104948	0.8409021610702443	0.9188645805541792	0.728341654914815	104948
100	31903888	104947	0.8571945839328423	0.9229229992281819	0.7441470456516146	104947
100	31903888	104948	0.8402923352517437	0.9193219499180546	0.7277794717383848	104948
100	32008835	104947	0.8584237758106473	0.9224084537909611	0.7425652948631214	104947
100	32008835	104948	0.8410546175248694	0.9189408087814918	0.7280272134771506	104948
100	32113782	104947	0.8582808465225304	0.9224751541254157	0.7436134429759783	104947
100	32113782	104948	0.8406639478598925	0.9178736135991158	0.7281987269886039	104948
100	32218729	104947	0.8575566714627383	0.9226371406519481	0.7436134429759783	104947
100	32218729	104948	0.839539581507032	0.9192647787475702	0.7273125738460952	104948
100	32323676	104947	0.8573851563169981	0.9223798679333378	0.7439088301714198	104947
100	32323676	104948	0.8406163052178222	0.9184643823607882	0.7271410603346419	104948
100	32428623	104947	0.8580616882807512	0.9224465682677924	0.7439469446482511	104947
100	32428623	104948	0.8407020619735488	0.9191218508213591	0.7275126729427908	104948
100	32533570	104947	0.8579282876118421	0.9222655245028443	0.7427749244856928	104947
100	32533570	104948	0.8397110950184854	0.918302397377749	0.7274840873575485	104948
100	32638517	104947	0.8575185569859072	0.9227038409864027	0.7428130389625239	104947
100	32638517	104948	0.8405210199336814	0.918168997979952	0.727684186454244	104948
100	32743464	104947	0.8577948869429332	0.9220273090226495	0.7436610860720173	104947
100	32743464	104948	0.8402256355528452	0.9180832412242254	0.7275031444143767	104948
100	32848411	104947	0.8581283886152058	0.9221416524531431	0.7437849581217186	104947
100	32848411	104948	0.8401779929107749	0.9184072111903038	0.7282273125738461	104948
100	32953358	104947	0.8582427320456992	0.9218653224961171	0.7431560692540043	104947
100	32953358	104948	0.8398635514731105	0.9181308838662957	0.7273316309029233	104948
100	33058305	104947	0.8582808465225304	0.9222941103604677	0.7440994025555757	104947
100	33058305	104948	0.8409498037123146	0.9183595685482334	0.7269886038800167	104948
100	33163252	104947	0.858480947525894	0.922503739983039	0.7423747224789655	104947
100	33163252	104948	0.8403876205358844	0.9188550520257651	0.7271791744482982	104948
100	33268199	104947	0.8581283886152058	0.9224465682677924	0.7428511534393551	104947
100	33268199	104948	0.8405210199336814	0.9183595685482334	0.7274745588291345	104948
100	33373146	104947	0.8571945839328423	0.9223036389796755	0.7436515574528095	104947
100	33373146	104948	0.8394347676944772	0.9188264664405229	0.7276556008690018	104948
100	33478093	104947	0.8577948869429332	0.9221511810723508	0.7441470456516146	104947
100	33478093	104948	0.8406639478598925	0.918168997979952	0.7273411594313374	104948
100	33583040	104947	0.856937311214232	0.9221130665955196	0.7428416248201473	104947
100	33583040	104948	0.8392632541830239	0.9182166406220224	0.7271029462209856	104948
100	33687987	104947	0.8584904761451018	0.9219415514497794	0.7445758335159652	104947
100	33687987	104948	0.8400827076266342	0.9179879559400846	0.7274364447154782	104948
100	33792934	104947	0.8574613852706604	0.9221607096915586	0.7445758335159652	104947
100	33792934	104948	0.8395491100354461	0.9178259709570454	0.7280653275908069	104948
100	33897881	104947	0.857347041840167	0.9225704403174936	0.743765900883303	104947
100	33897881	104948	0.8395491100354461	0.9181499409231239	0.7275031444143767	104948
100	34002828	104947	0.8574518566514526	0.9224370396485845	0.7430321972043031	104947
100	34002828	104948	0.8400922361550482	0.9184929679460304	0.727017189465259	104948
100	34107775	104947	0.8575947859395695	0.9225418544598702	0.7432799413037057	104947
100	34107775	104948	0.8405591340473377	0.9189979799519762	0.726883790067462	104948
100	34212722	104947	0.8576329004164006	0.9227895985592728	0.7438802443137965	104947
100	34212722	104948	0.839787323245798	0.9186168388154133	0.7267885047833212	104948
100	34317669	104947	0.8582141461880759	0.9226752551287792	0.7437087291680563	104947
100	34317669	104948	0.8394157106376491	0.9178164424286314	0.7272839882608529	104948
100	34422616	104947	0.856432294396219	0.9222464672644287	0.7429273823930175	104947
100	34422616	104948	0.8396348667911727	0.918569196173343	0.7277318290963144	104948
100	34527563	104947	0.8577472438468942	0.9227705413208572	0.7435753284991472	104947
100	34527563	104948	0.8395205244502039	0.9185120250028586	0.7260643366238518	104948
100	34632510	104947	0.857299398744128	0.9215604066814678	0.7429845541082641	104947
100	34632510	104948	0.8395109959217898	0.9179593703548424	0.7274650303007204	104948
100	34737457	104947	0.8567467388300761	0.9220654234994807	0.7429940827274719	104947
100	34737457	104948	0.8384533292678279	0.9179879559400846	0.727026717993673	104948
100	34842404	104947	0.8579568734694656	0.9222369386452209	0.7437087291680563	104947
100	34842404	104948	0.8396920379616573	0.9184643823607882	0.7278842855509395	104948
100	34947351	104947	0.857661486274024	0.9226371406519481	0.7435753284991472	104947
100	34947351	104948	0.8399493082288372	0.9186454244006556	0.727950985249838	104948
100	35052298	104947	0.85711835497918	0.9222369386452209	0.7443280894165627	104947
100	35052298	104948	0.8393394824103365	0.9186168388154133	0.7278652284941114	104948
100	35157245	104947	0.8563655940617645	0.9218081507808703	0.7445567762775496	104947
100	35157245	104948	0.8401494073255327	0.9185310820596867	0.727407859130236	104948
100	35262192	104947	0.8582808465225304	0.9220654234994807	0.7438040153601342	104947
100	35262192	104948	0.8408545184281739	0.918845523497351	0.7275126729427908	104948
100	35367139	104947	0.8555747186675179	0.9222083527875975	0.7429940827274719	104947
100	35367139	104948	0.8390726836147425	0.9184929679460304	0.7266551053855242	104948
100	35472086	104947	0.8564418230154268	0.9221130665955196	0.7441661028900302	104947
100	35472086	104948	0.8393585394671647	0.918569196173343	0.7278842855509395	104948
100	35577033	104947	0.8573756276977903	0.9220463662610651	0.74399458774429	104947
100	35577033	104948	0.8398921370583528	0.9183309829629912	0.7274650303007204	104948
100	35681980	104947	0.8564132371578035	0.9225418544598702	0.743765900883303	104947
100	35681980	104948	0.8391965544841254	0.9178450280138736	0.727017189465259	104948
100	35786927	104947	0.8565942809227515	0.9220940093571041	0.7433180557805369	104947
100	35786927	104948	0.8399683652856653	0.9186073102869993	0.7265407630445554	104948
100	35891874	104947	0.8565656950651281	0.9219034369729483	0.7440327022211212	104947
100	35891874	104948	0.8397396806037276	0.9180165415253269	0.7275793726416893	104948
100	35996821	104947	0.8567657960684917	0.9217128645887924	0.7436992005488484	104947
100	35996821	104948	0.8400064793993216	0.9183500400198193	0.7271315318062278	104948
100	36101768	104947	0.8572898701249202	0.9230754571355065	0.7439469446482511	104947
100	36101768	104948	0.8398349658878683	0.9188741090825933	0.7277604146815566	104948
100	36206715	104947	0.8570897691215565	0.9221511810723508	0.7440041163634977	104947
100	36206715	104948	0.8401208217402905	0.91870259557114	0.7263216068910318	104948
100	36311662	104947	0.8569849543102709	0.9222845817412598	0.7440327022211212	104947
100	36311662	104948	0.8398063803026261	0.9188074093836948	0.7268361474253916	104948
100	36416609	104947	0.8569182539758163	0.9214651204893899	0.7434419278302381	104947
100	36416609	104948	0.8406163052178222	0.9179593703548424	0.7268742615390479	104948
100	36521556	104947	0.8572898701249202	0.9222464672644287	0.7445281904199262	104947
100	36521556	104948	0.8405686625757518	0.9191409078781873	0.7269886038800167	104948
100	36626503	104947	0.857347041840167	0.9227228982248182	0.7427749244856928	104947
100	36626503	104948	0.8405686625757518	0.9188931661394214	0.7270076609368449	104948
100	36731450	104947	0.8568515536413618	0.9216080497775068	0.7431465406347966	104947
100	36731450	104948	0.8400731790982201	0.9183214544345771	0.7275126729427908	104948
100	36836397	104947	0.8579282876118421	0.9216556928735457	0.7429845541082641	104947
100	36836397	104948	0.8411880169226664	0.9182452262072646	0.7279795708350802	104948
100	36941344	104947	0.8574042135554136	0.9216366356351301	0.7441375170324068	104947
100	36941344	104948	0.8398159088310402	0.9181880550367801	0.7267408621412509	104948
100	37046291	104947	0.8573565704593747	0.9211697332939484	0.7429654968698486	104947
100	37046291	104948	0.8396062812059305	0.9185977817585852	0.7268647330106338	104948
100	37151238	104947	0.8568610822605697	0.9224370396485845	0.7437277864064719	104947
100	37151238	104948	0.8403304493654	0.9190075084803903	0.727808057323627	104948
100	37256185	104947	0.8568706108797775	0.9214270060125587	0.7438040153601342	104947
100	37256185	104948	0.8402065784960171	0.9189884514235621	0.7272744597324389	104948
100	37361132	104947	0.8573184559825435	0.9224846827446235	0.7440612880787445	104947
100	37361132	104948	0.8414167016046041	0.9194744063726799	0.728074856119221	104948
100	37466079	104947	0.8571850553136345	0.9220654234994807	0.7434419278302381	104947
100	37466079	104948	0.839930251172009	0.918835994968937	0.7271410603346419	104948
100	37571026	104947	0.8576424290356085	0.9217605076848314	0.7421841500948098	104947
100	37571026	104948	0.8404257346495407	0.9180737126958113	0.7265121774593132	104948
100	37675973	104947	0.8577377152276864	0.9217986221616625	0.7433656988765758	104947
100	37675973	104948	0.8409688607691428	0.9178545565422876	0.7278556999656973	104948
100	37780920	104947	0.8573184559825435	0.9220177804034417	0.7431560692540043	104947
100	37780920	104948	0.840197049967603	0.9192076075770858	0.726616991271868	104948
100	37885867	104947	0.8572612842672969	0.921598521158299	0.7454143520062507	104947
100	37885867	104948	0.8400827076266342	0.9183500400198193	0.727941456721424	104948
100	37990814	104947	0.857575728701154	0.9220177804034417	0.7435562712607316	104947
100	37990814	104948	0.8407973472576895	0.918302397377749	0.7267885047833212	104948
100	38095761	104947	0.8576138431779851	0.9216366356351301	0.743765900883303	104947
100	38095761	104948	0.8409688607691428	0.9184548538323741	0.7280081564203225	104948
100	38200708	104947	0.8574137421746215	0.922008251784234	0.7432132409692511	104947
100	38200708	104948	0.8406734763883066	0.9193124213896405	0.7278842855509395	104948
100	38305655	104947	0.8570611832639332	0.9217986221616625	0.7437277864064719	104947
100	38305655	104948	0.8402542211380875	0.9189503373099058	0.7278461714372833	104948
100	38410602	104947	0.8573565704593747	0.9211983191515718	0.743632500214394	104947
100	38410602	104948	0.8405210199336814	0.9183500400198193	0.727550787056447	104948
100	38515549	104947	0.8580426310423357	0.9217509790656236	0.7435562712607316	104947
100	38515549	104948	0.8412356595647368	0.9180260700537409	0.7273602164881656	104948
100	38620496	104947	0.8567276815916606	0.9216842787311691	0.7430798403003421	104947
100	38620496	104948	0.8403495064222282	0.9185120250028586	0.7263883065899303	104948
100	38725443	104947	0.8570897691215565	0.9216556928735457	0.7434133419726148	104947
100	38725443	104948	0.8398540229446965	0.9177116286160766	0.7280462705339787	104948
100	38830390	104947	0.8579664020886734	0.9216938073503769	0.7430417258235109	104947
100	38830390	104948	0.8403876205358844	0.9177592712581469	0.7267789762549072	104948
100	38935337	104947	0.8582617892841148	0.9220749521186885	0.742717752770446	104947
100	38935337	104948	0.8407878187292754	0.918435796775546	0.7271315318062278	104948
100	39040284	104947	0.8577091293700629	0.9211125615787016	0.7439374160290433	104947
100	39040284	104948	0.8404447917063689	0.917635400388764	0.7281606128749476	104948
100	39145231	104947	0.8576329004164006	0.921141147436325	0.7434609850686537	104947
100	39145231	104948	0.8410832031101116	0.9174257727636543	0.7272458741471967	104948
100	39250178	104947	0.8572136411712579	0.9212459622476107	0.7434228705918225	104947
100	39250178	104948	0.8398159088310402	0.9180546556389831	0.727808057323627	104948
100	39355125	104947	0.8579473448502577	0.9210839757210783	0.7429273823930175	104947
100	39355125	104948	0.8408926325418302	0.9184262682471319	0.7275031444143767	104948
100	39460072	104947	0.8581569744728291	0.9218748511153249	0.743537214022316	104947
100	39460072	104948	0.8401017646834623	0.9183309829629912	0.7279223996645958	104948
100	39565019	104947	0.8572898701249202	0.9220940093571041	0.7430226685850954	104947
100	39565019	104948	0.8402161070244312	0.9186930670427259	0.728351183443229	104948
100	39669966	104947	0.8570802405023488	0.9216938073503769	0.7431560692540043	104947
100	39669966	104948	0.8397206235468995	0.918969394366734	0.7276079582269315	104948
100	39774913	104947	0.8581665030920369	0.9211220901979095	0.742670109674407	104947
100	39774913	104948	0.8402065784960171	0.9178926706559439	0.7272172885619544	104948
100	39879860	104947	0.8580331024231279	0.9213984201549353	0.7424985945286668	104947
100	39879860	104948	0.8399588367572512	0.9183214544345771	0.7281701414033617	104948
100	39984807	104947	0.8578997017542188	0.9215604066814678	0.7422984935253032	104947
100	39984807	104948	0.8412261310363227	0.9186930670427259	0.7272268170903685	104948
100	40089754	104947	0.856889668118193	0.9209219891945458	0.7425081231478746	104947
100	40089754	104948	0.8399397797004231	0.9177497427297329	0.7279128711361817	104948
100	40194701	104947	0.8574709138898682	0.9204646154725719	0.7427558672472772	104947
100	40194701	104948	0.8406448908030644	0.917768799786561	0.7272172885619544	104948
100	40299648	104947	0.8577186579892708	0.920598016141481	0.7423270793829266	104947
100	40299648	104948	0.8399969508709075	0.9184929679460304	0.7273125738460952	104948
100	40404595	104947	0.8576519576548163	0.9219510800689872	0.7420507494259007	104947
100	40404595	104948	0.8400350649845638	0.9190551511224606	0.7267408621412509	104948
100	40509542	104947	0.8580426310423357	0.9210077467674159	0.7430321972043031	104947
100	40509542	104948	0.8413785874909478	0.9183405114914053	0.7286275107672371	104948
100	40614489	104947	0.8571564694560111	0.9216652214927535	0.7430131399658876	104947
100	40614489	104948	0.8405591340473377	0.9182928688493349	0.727550787056447	104948
100	40719436	104947	0.8575185569859072	0.9209886895290004	0.7427749244856928	104947
100	40719436	104948	0.8396920379616573	0.9181022982810535	0.7270076609368449	104948
100	40824383	104947	0.8572231697904656	0.9213317198204808	0.7411550592203684	104947
100	40824383	104948	0.8399778938140794	0.9180927697526394	0.7262358501353051	104948
100	40929330	104947	0.8580140451847122	0.9210744471018705	0.7425557662439136	104947
100	40929330	104948	0.8393871250524069	0.9177402142013188	0.727407859130236	104948
100	41034277	104947	0.8569658970718553	0.9216175783967145	0.742584352101537	104947
100	41034277	104948	0.8402542211380875	0.9184167397187178	0.7270362465220871	104948
100	41139224	104947	0.8572041125520501	0.9213317198204808	0.7433180557805369	104947
100	41139224	104948	0.8397396806037276	0.9179498418264284	0.7289800663185577	104948
100	41244171	104947	0.8574137421746215	0.9207504740488056	0.7428797392969785	104947
100	41244171	104948	0.8402828067233297	0.9180451271105691	0.7273221023745092	104948
100	41349118	104947	0.8570040115486864	0.9214651204893899	0.7446234766120041	104947
100	41349118	104948	0.8391107977283988	0.9181975835651942	0.7281701414033617	104948
100	41454065	104947	0.8580140451847122	0.9214460632509743	0.7430036113466798	104947
100	41454065	104948	0.8402446926096734	0.9183214544345771	0.7274555017723063	104948
100	41559012	104947	0.8574232707938293	0.9211983191515718	0.7432418268268746	104947
100	41559012	104948	0.8402732781949156	0.9181594694515379	0.7269504897663605	104948
100	41663959	104947	0.857070711883141	0.9215699353006755	0.7441851601284458	104947
100	41663959	104948	0.8396920379616573	0.9184739108892023	0.727550787056447	104948
100	41768906	104947	0.8568229677837385	0.9211602046747406	0.7418887628993682	104947
100	41768906	104948	0.8390155124442581	0.9181213553378816	0.7265121774593132	104948
100	41873853	104947	0.857480442509076	0.9210077467674159	0.7417934767072903	104947
100	41873853	104948	0.8390440980295003	0.9184929679460304	0.7271696459198841	104948
100	41978800	104947	0.8568039105453229	0.9204265009957407	0.742946439631433	104947
100	41978800	104948	0.8398254373594543	0.918035598582155	0.7270457750505012	104948
100	42083747	104947	0.8582808465225304	0.9213603056781042	0.7426415238167837	104947
100	42083747	104948	0.8412833022068071	0.9181975835651942	0.7261691504364066	104948
100	42188694	104947	0.8578997017542188	0.9210839757210783	0.7430703116811342	104947
100	42188694	104948	0.8396158097343446	0.9178926706559439	0.7273888020734078	104948
100	42293641	104947	0.8565752236843359	0.9218176794000781	0.7432418268268746	104947
100	42293641	104948	0.839272782711438	0.9181785265083661	0.726226321606891	104948
100	42398588	104947	0.8572422270288812	0.9210172753866237	0.7430703116811342	104947
100	42398588	104948	0.8401589358539467	0.9181308838662957	0.7273316309029233	104948
100	42503535	104947	0.8580140451847122	0.9212554908668185	0.7437182577872641	104947
100	42503535	104948	0.8407211190303769	0.9171875595533026	0.7279890993634943	104948
100	42608482	104947	0.8581093313767902	0.9211697332939484	0.7428035103433162	104947
100	42608482	104948	0.8399588367572512	0.917768799786561	0.7270934176925715	104948
100	42713429	104947	0.8576519576548163	0.9207600026680134	0.7424890659094591	104947
100	42713429	104948	0.8388535274612189	0.9185977817585852	0.7282368411022602	104948
100	42818376	104947	0.8562131361544398	0.9205408444262342	0.7425176517670824	104947
100	42818376	104948	0.8389106986317033	0.9182071120936083	0.7266646339139383	104948
100	42923323	104947	0.857842530038972	0.9209791609097926	0.7437373150256796	104947
100	42923323	104948	0.8395681670922742	0.9174543583488965	0.7271410603346419	104948
100	43028270	104947	0.8574232707938293	0.9210649184826627	0.7431179547771732	104947
100	43028270	104948	0.8396539238480009	0.9180070129969128	0.728208255517018	104948
100	43133217	104947	0.8572612842672969	0.9206361306183121	0.7430131399658876	104947
100	43133217	104948	0.8402446926096734	0.9179498418264284	0.7278747570225255	104948
100	43238164	104947	0.857575728701154	0.9209696322905848	0.7428892679161863	104947
100	43238164	104948	0.8393585394671647	0.918035598582155	0.7277318290963144	104948
100	43343111	104947	0.8573375132209592	0.9210077467674159	0.7437754295025107	104947
100	43343111	104948	0.8402732781949156	0.9177116286160766	0.7276174867553455	104948
100	43448058	104947	0.8573756276977903	0.921369834297312	0.7436515574528095	104947
100	43448058	104948	0.8397777947173839	0.9177878568433891	0.7290562945458703	104948
100	43553005	104947	0.8571088263599722	0.9210839757210783	0.7435181567839004	104947
100	43553005	104948	0.8398254373594543	0.9179212562411861	0.727684186454244	104948
100	43657952	104947	0.8576900721316474	0.9208838747177146	0.7442042173668614	104947
100	43657952	104948	0.8399778938140794	0.9175972862751077	0.7280939131760491	104948
100	43762899	104947	0.8567467388300761	0.921779564923247	0.74317512649242	104947
100	43762899	104948	0.8385676716087967	0.9182928688493349	0.7269314327095323	104948
100	43867846	104947	0.8582236748072837	0.9202645144692083	0.7433752274957836	104947
100	43867846	104948	0.8400255364561497	0.9181308838662957	0.7271887029767122	104948
100	43972793	104947	0.857890173135011	0.9216271070159223	0.7433180557805369	104947
100	43972793	104948	0.8405210199336814	0.9185024964744445	0.7279795708350802	104948
100	44077740	104947	0.8570611832639332	0.920912460575338	0.7434990995454849	104947
100	44077740	104948	0.8398159088310402	0.9181594694515379	0.7283797690284712	104948
100	44182687	104947	0.8575471428435305	0.9209219891945458	0.7432418268268746	104947
100	44182687	104948	0.8395967526775164	0.9186454244006556	0.7287799672218622	104948
100	44287634	104947	0.8579854593270889	0.9206742450951433	0.7436706146912251	104947
100	44287634	104948	0.8402351640812593	0.9172733163090292	0.7279319281930099	104948
100	44392581	104947	0.8575566714627383	0.9205122585686109	0.7425652948631214	104947
100	44392581	104948	0.8402732781949156	0.9177306856729047	0.7269123756527042	104948
100	44497528	104947	0.8574709138898682	0.9208076457640524	0.7448521634729911	104947
100	44497528	104948	0.840463848763197	0.9185977817585852	0.7285703395967527	104948
100	44602475	104947	0.8575566714627383	0.9211887905323639	0.7440327022211212	104947
100	44602475	104948	0.8400445935129779	0.9185501391165148	0.7274840873575485	104948
100	44707422	104947	0.8584618902874784	0.9215318208238444	0.7453000085757573	104947
100	44707422	104948	0.8402828067233297	0.9184739108892023	0.7293040362846362	104948
100	44812369	104947	0.8570992977407644	0.9209982181482081	0.74399458774429	104947
100	44812369	104948	0.8397301520753135	0.9185596676449289	0.7282273125738461	104948
100	44917316	104947	0.8577663010853097	0.921093504340286	0.7432513554460823	104947
100	44917316	104948	0.8401684643823608	0.9183309829629912	0.7278938140793536	104948
100	45022263	104947	0.8563655940617645	0.9207981171448445	0.7423366080021344	104947
100	45022263	104948	0.8399111941151809	0.9182356976788505	0.7278556999656973	104948
100	45127210	104947	0.8559749206742451	0.9211983191515718	0.7429750254890564	104947
100	45127210	104948	0.8393775965239928	0.9184548538323741	0.7279605137782521	104948
100	45232157	104947	0.8567753246876995	0.9202931003268316	0.7451189648108093	104947
100	45232157	104948	0.8395205244502039	0.9179498418264284	0.7283702405000572	104948
100	45337104	104947	0.8578806445158033	0.9207504740488056	0.7454143520062507	104947
100	45337104	104948	0.8406067766894081	0.9179974844684987	0.7293612074551207	104948
100	45442051	104947	0.8568324964029462	0.921827208019286	0.7435086281646927	104947
100	45442051	104948	0.8405114914052674	0.918712124099554	0.7281320272897054	104948
100	45546998	104947	0.8578139441813487	0.9212840767244419	0.743356170257368	104947
100	45546998	104948	0.840320920836986	0.9181022982810535	0.727150588863056	104948
100	45651945	104947	0.857575728701154	0.9207028309527666	0.7447378200424977	104947
100	45651945	104948	0.8402828067233297	0.9178545565422876	0.7280462705339787	104948
100	45756892	104947	0.8588335064365823	0.921731921827208	0.7442899749397315	104947
100	45756892	104948	0.8410165034112131	0.9190551511224606	0.7285322254830964	104948
100	45861839	104947	0.8565466378267125	0.920912460575338	0.7440136449827055	104947
100	45861839	104948	0.8392632541830239	0.9179974844684987	0.7286465678240652	104948
100	45966786	104947	0.8574042135554136	0.9211602046747406	0.745490580959913	104947
100	45966786	104948	0.8399683652856653	0.9183119259061631	0.7284464687273697	104948
100	46071733	104947	0.8574709138898682	0.9209886895290004	0.7440041163634977	104947
100	46071733	104948	0.8395776956206883	0.9179498418264284	0.7280557990623928	104948
100	46176680	104947	0.8578806445158033	0.920416972376533	0.7446139479927963	104947
100	46176680	104948	0.840463848763197	0.9178640850707017	0.7284464687273697	104948
100	46281627	104947	0.8573279846017514	0.9202931003268316	0.744175631509238	104947
100	46281627	104948	0.8419502991957922	0.9172637877806151	0.7288943095628311	104948
100	46386574	104947	0.857661486274024	0.9201978141347538	0.7452618940989261	104947
100	46386574	104948	0.8400160079277357	0.9171208598544041	0.7290658230742844	104948
100	46491521	104947	0.8580521596615435	0.9203598006612862	0.744175631509238	104947
100	46491521	104948	0.8404829058200252	0.917635400388764	0.7285608110683386	104948
100	46596468	104947	0.8577091293700629	0.9209410464329614	0.7442518604629004	104947
100	46596468	104948	0.8404733772916111	0.9181118268094676	0.7282463696306742	104948
100	46701415	104947	0.8563465368233489	0.9200548848466369	0.7450903789531859	104947
100	46701415	104948	0.8397301520753135	0.9176544574455922	0.7273221023745092	104948
100	46806362	104947	0.8567467388300761	0.9205789589030654	0.7441089311747835	104947
100	46806362	104948	0.8402065784960171	0.9183405114914053	0.7270934176925715	104948
100	46911309	104947	0.8574899711282838	0.9203598006612862	0.7444710187046795	104947
100	46911309	104948	0.8403685634790563	0.9173590730647558	0.7278366429088692	104948
100	47016256	104947	0.8563465368233489	0.9206266019991043	0.7454524664830819	104947
100	47016256	104948	0.8398349658878683	0.9179879559400846	0.7287799672218622	104948
100	47121203	104947	0.8576138431779851	0.9210077467674159	0.744995092761108	104947
100	47121203	104948	0.8405305484620955	0.9174353012920684	0.7264359492320006	104948
100	47226150	104947	0.8573184559825435	0.9206551878567277	0.7444424328470561	104947
100	47226150	104948	0.8407782902008614	0.9179498418264284	0.7274936158859626	104948
100	47331097	104947	0.85638465130018	0.9208362316216757	0.7438993015522121	104947
100	47331097	104948	0.8391965544841254	0.9179879559400846	0.72767465792583	104948
100	47436044	104947	0.8573851563169981	0.9202740430884161	0.7442423318436925	104947
100	47436044	104948	0.8403399778938141	0.9171875595533026	0.7273221023745092	104948
100	47540991	104947	0.8576424290356085	0.9206933023335588	0.7443185607973548	104947
100	47540991	104948	0.8413118877920494	0.9169969889850211	0.7275603155848611	104948
100	47645938	104947	0.856937311214232	0.9206456592375198	0.7438897729330043	104947
100	47645938	104948	0.8411022601669398	0.9177402142013188	0.7286846819377215	104948
100	47750885	104947	0.8571278835983878	0.9210363326250393	0.7449855641419002	104947
100	47750885	104948	0.8395776956206883	0.9180737126958113	0.7291992224720815	104948
100	47855832	104947	0.8577948869429332	0.9210649184826627	0.7458907829666404	104947
100	47855832	104948	0.8399683652856653	0.9175972862751077	0.7282273125738461	104948
100	47960779	104947	0.8558034055285049	0.9209601036713769	0.7434323992110303	104947
100	47960779	104948	0.8393490109387506	0.9181118268094676	0.7281224987612913	104948
100	48065726	104947	0.8573565704593747	0.9206361306183121	0.7448998065690301	104947
100	48065726	104948	0.8403495064222282	0.9169969889850211	0.728084384647635	104948
100	48170673	104947	0.8571088263599722	0.920140642419507	0.7444329042278484	104947
100	48170673	104948	0.8390917406715707	0.9177402142013188	0.7276365438121737	104948
100	48275620	104947	0.858252260664907	0.9210172753866237	0.7450522644763548	104947
100	48275620	104948	0.8409021610702443	0.9175591721614513	0.72767465792583	104948
100	48380567	104947	0.856479937492258	0.9206361306183121	0.745271422718134	104947
100	48380567	104948	0.8403780920074704	0.9176068148035218	0.7293516789267066	104948
100	48485514	104947	0.8564989947306736	0.9202454572307927	0.7442804463205237	104947
100	48485514	104948	0.8402637496665015	0.9176639859740062	0.7282654266875024	104948
100	48590461	104947	0.8566419240187905	0.9202645144692083	0.7448998065690301	104947
100	48590461	104948	0.8400350649845638	0.9171303883828181	0.7284655257841979	104948
100	48695408	104947	0.8566895671148294	0.9200548848466369	0.7462623991157441	104947
100	48695408	104948	0.8395776956206883	0.9162251781834814	0.7299900903304494	104948
100	48800355	104947	0.8565656950651281	0.9213984201549353	0.7458717257282247	104947
100	48800355	104948	0.8402065784960171	0.9182261691504364	0.7287609101650341	104948
100	48905302	104947	0.8573279846017514	0.9208743460985068	0.7456430388672377	104947
100	48905302	104948	0.8400445935129779	0.9179498418264284	0.7288276098639326	104948
100	49010249	104947	0.8570611832639332	0.9207504740488056	0.7458240826321858	104947
100	49010249	104948	0.8407211190303769	0.9175591721614513	0.7295613065518162	104948
100	49115196	104947	0.8569277825950241	0.9205217871878186	0.7451284934300171	104947
100	49115196	104948	0.8399588367572512	0.9174734154057247	0.7283702405000572	104948
100	49220143	104947	0.8564418230154268	0.9209601036713769	0.7452523654797183	104947
100	49220143	104948	0.8397682661889698	0.9178926706559439	0.7273316309029233	104948
100	49325090	104947	0.8563655940617645	0.9201692282771303	0.7459003115858481	104947
100	49325090	104948	0.8390250409726722	0.9175305865762091	0.7284655257841979	104948
100	49430037	104947	0.8560035065318685	0.9214174773933509	0.744356675274186	104947
100	49430037	104948	0.8387868277623204	0.9182833403209208	0.7279128711361817	104948
100	49534984	104947	0.8568991967374008	0.920598016141481	0.7449283924266534	104947
100	49534984	104948	0.839920722643595	0.9175496436330373	0.7287609101650341	104948
100	49639931	104947	0.8571659980752189	0.9213412484396886	0.7459670119203027	104947
100	49639931	104948	0.8404829058200252	0.9179212562411861	0.729942447688379	104948
100	49744878	104947	0.856117849962362	0.920550373045442	0.7455572812943676	104947
100	49744878	104948	0.8405114914052674	0.9171494454396463	0.7281415558181195	104948
100	49849825	104947	0.8562703078696866	0.9205122585686109	0.7446901769464587	104947
100	49849825	104948	0.840063650569806	0.9178640850707017	0.7281796699317757	104948
100	49954772	104947	0.8550030015150505	0.9199405414161433	0.7449569782842769	104947
100	49954772	104948	0.8395109959217898	0.9171589739680603	0.7290086519037999	104948
100	50059719	104947	0.8566419240187905	0.9205599016646497	0.7458621971090169	104947
100	50059719	104948	0.8406258337462362	0.9177211571444907	0.7291611083584251	104948
100	50164666	104947	0.856251250631271	0.9197785548896109	0.7448998065690301	104947
100	50164666	104948	0.8404447917063689	0.9168064184167397	0.7289419522049014	104948
100	50269613	104947	0.8558224627669204	0.9201025279426758	0.7452523654797183	104947
100	50269613	104948	0.8393966535808209	0.9175401151046232	0.7281129702328772	104948
100	50374560	104947	0.8559177489589983	0.9195117535517928	0.7454524664830819	104947
100	50374560	104948	0.839920722643595	0.9169493463429508	0.7284941113694401	104948
100	50479507	104947	0.8568801394989852	0.9203979151381173	0.7448712207114067	104947
100	50479507	104948	0.8408164043145177	0.9178640850707017	0.7280462705339787	104948
100	50584454	104947	0.8560320923894918	0.9198166693664421	0.7444614900854717	104947
100	50584454	104948	0.8401684643823608	0.9175020009909669	0.7281224987612913	104948
100	50689401	104947	0.8557462338132581	0.9198928983201045	0.7440994025555757	104947
100	50689401	104948	0.8397396806037276	0.9173209589510996	0.7286275107672371	104948
100	50794348	104947	0.8571088263599722	0.9199881845121823	0.7455191668175365	104947
100	50794348	104948	0.8416835004001982	0.9169302892861226	0.7289419522049014	104948
100	50899295	104947	0.8562417220120633	0.9208934033369225	0.745042735857147	104947
100	50899295	104948	0.8403590349506422	0.9183119259061631	0.7285989251819949	104948
100	51004242	104947	0.855841520005336	0.920416972376533	0.7442899749397315	104947
100	51004242	104948	0.8396348667911727	0.9168921751724664	0.7284559972557838	104948
100	51109189	104947	0.8568134391645307	0.9208362316216757	0.7447568772809132	104947
100	51109189	104948	0.8401589358539467	0.9174734154057247	0.7275793726416893	104948
100	51214136	104947	0.8565752236843359	0.920321686184455	0.7457287964401078	104947
100	51214136	104948	0.8403590349506422	0.9173590730647558	0.7287323245797919	104948
100	51319083	104947	0.855022058753466	0.9199500700353512	0.7438516584561731	104947
100	51319083	104948	0.839796851774212	0.9180737126958113	0.728341654914815	104948
100	51424030	104947	0.8550030015150505	0.9197023259359486	0.7455001095791209	104947
100	51424030	104948	0.8391774974272973	0.9172733163090292	0.7288561954491748	104948
100	51528977	104947	0.8564418230154268	0.919730911793572	0.7447282914232899	104947
100	51528977	104948	0.8409021610702443	0.9168350040019819	0.7286656248808934	104948
100	51633924	104947	0.8572422270288812	0.9196451542207019	0.7453381230525884	104947
100	51633924	104948	0.8405019628768533	0.9175305865762091	0.7291801654152533	104948
100	51738871	104947	0.856251250631271	0.9205599016646497	0.7447568772809132	104947
100	51738871	104948	0.8408354613713458	0.9173209589510996	0.7276937149826581	104948
100	51843818	104947	0.8560225637702841	0.9190448512106111	0.7452428368605105	104947
100	51843818	104948	0.8399493082288372	0.9164157487517628	0.7281224987612913	104948
100	51948765	104947	0.8560130351510763	0.9204741440917796	0.7449760355226924	104947
100	51948765	104948	0.8388344704043907	0.917378130121584	0.7275126729427908	104948
100	52053712	104947	0.8562703078696866	0.9186637064422994	0.744995092761108	104947
100	52053712	104948	0.8404829058200252	0.9156439379502229	0.7287323245797919	104948
100	52158659	104947	0.8562798364888944	0.9198452552240655	0.7453476516717963	104947
100	52158659	104948	0.8399874223424935	0.9162918778823799	0.7293707359835347	104948
100	52263606	104947	0.856842025022154	0.9200072417505979	0.7435943857375628	104947
100	52263606	104948	0.8397587376605558	0.9171494454396463	0.7281701414033617	104948
100	52368553	104947	0.8559177489589983	0.920912460575338	0.7433371130189524	104947
100	52368553	104948	0.8401112932118764	0.9176068148035218	0.7288752525060029	104948
100	52473500	104947	0.8559844492934529	0.9207218881911822	0.7435753284991472	104947
100	52473500	104948	0.8408259328429317	0.9174829439341388	0.7283797690284712	104948
100	52578447	104947	0.8564227657770113	0.9194069387405072	0.7444043183702249	104947
100	52578447	104948	0.8396920379616573	0.916577733734802	0.7277508861531425	104948
100	52683394	104947	0.8566228667803748	0.9197785548896109	0.7461385270660429	104947
100	52683394	104948	0.8393585394671647	0.9166158478484583	0.7296470633075428	104948
100	52788341	104947	0.8558986917205827	0.9200834707042602	0.7447378200424977	104947
100	52788341	104948	0.8403495064222282	0.9163871631665206	0.7283892975568853	104948
100	52893288	104947	0.8561845502968165	0.9199786558929746	0.7446615910888353	104947
100	52893288	104948	0.8409402751839006	0.9171208598544041	0.7288371383923467	104948
100	52998235	104947	0.856842025022154	0.9199024269393122	0.7455191668175365	104947
100	52998235	104948	0.8409498037123146	0.9171208598544041	0.7295231924381599	104948
100	53103182	104947	0.8564513516346346	0.920369329280494	0.7438421298369653	104947
100	53103182	104948	0.8407211190303769	0.9173019018942714	0.7283225978579868	104948
100	53208129	104947	0.8557271765748425	0.9198166693664421	0.7442328032244847	104947
100	53208129	104948	0.8398254373594543	0.9169874604566071	0.7286560963524793	104948
100	53313076	104947	0.857299398744128	0.9200453562274291	0.7453667089102118	104947
100	53313076	104948	0.8408259328429317	0.9165586766779739	0.7293135648130503	104948
100	53418023	104947	0.856070206866323	0.919187780498728	0.7446425338504198	104947
100	53418023	104948	0.8409879178259709	0.9164729199222472	0.7284178831421275	104948
100	53522970	104947	0.8571850553136345	0.9193974101212993	0.7437754295025107	104947
100	53522970	104948	0.8407687616724473	0.9163490490528643	0.7284941113694401	104948
100	53627917	104947	0.856432294396219	0.9203121575652472	0.7453762375294196	104947
100	53627917	104948	0.8414167016046041	0.9177116286160766	0.727950985249838	104948
100	53732864	104947	0.8565752236843359	0.9193116525484292	0.7447378200424977	104947
100	53732864	104948	0.8401398787971186	0.9165205625643176	0.7290277089606281	104948
100	53837811	104947	0.8567848533069073	0.9196451542207019	0.7458526684898091	104947
100	53837811	104948	0.8407782902008614	0.9170446316270915	0.7294946068529177	104948
100	53942758	104947	0.8552412169952452	0.9202740430884161	0.7452047223836794	104947
100	53942758	104948	0.8405114914052674	0.9162823493539658	0.7283607119716431	104948
100	54047705	104947	0.8556223617635569	0.9197213831743642	0.7462528704965363	104947
100	54047705	104948	0.8396825094332432	0.9166444334337005	0.7294755497960895	104948
100	54152652	104947	0.8554889610946478	0.9199881845121823	0.7452523654797183	104947
100	54152652	104948	0.8403304493654	0.9166063193200442	0.7284369401989557	104948
100	54257599	104947	0.8563179509657256	0.9205122585686109	0.7446615910888353	104947
100	54257599	104948	0.8405496055189237	0.9176925715592484	0.728884781034417	104948
100	54362546	104947	0.8558701058629594	0.9200453562274291	0.7449760355226924	104947
100	54362546	104948	0.8400160079277357	0.9171303883828181	0.7281891984601898	104948
100	54467493	104947	0.856070206866323	0.9210839757210783	0.746224284638913	104947
100	54467493	104948	0.8405210199336814	0.9166634904905286	0.730075847086176	104948
100	54572440	104947	0.8569563684526476	0.9194831676941695	0.7446330052312119	104947
100	54572440	104948	0.8421218127072455	0.9163299919960362	0.7287132675229637	104948
100	54677387	104947	0.8559463348166217	0.9202263999923771	0.7447473486617054	104947
100	54677387	104948	0.8401017646834623	0.9165300910927316	0.7280653275908069	104948
100	54782334	104947	0.855708119336427	0.9210363326250393	0.7448616920921989	104947
100	54782334	104948	0.8400922361550482	0.9172828448374433	0.7290658230742844	104948
100	54887281	104947	0.8552126311376218	0.9205789589030654	0.7448902779498223	104947
100	54887281	104948	0.8398159088310402	0.9170351030986774	0.7281606128749476	104948
100	54992228	104947	0.8563370082041412	0.9202359286115849	0.7437087291680563	104947
100	54992228	104948	0.8402351640812593	0.9159583793878874	0.7287227960513778	104948
100	55097175	104947	0.8565180519690891	0.920598016141481	0.7451666079068482	104947
100	55097175	104948	0.840454320234783	0.9166730190189427	0.7293421503982925	104948
100	55202122	104947	0.8548791294653492	0.9199691272737668	0.7453095371949651	104947
100	55202122	104948	0.8392441971261958	0.9162347067118954	0.7281891984601898	104948
100	55307069	104947	0.8550030015150505	0.9207600026680134	0.7446330052312119	104947
100	55307069	104948	0.8403113923085719	0.9165396196211457	0.7283797690284712	104948
100	55412016	104947	0.8560511496279074	0.9208362316216757	0.7450522644763548	104947
100	55412016	104948	0.840197049967603	0.9173590730647558	0.7287323245797919	104948
100	55516963	104947	0.855936806197414	0.9201692282771303	0.7439660018866666	104947
100	55516963	104948	0.8401112932118764	0.9169398178145367	0.7284941113694401	104948
100	55621910	104947	0.8555556614291023	0.9208267030024679	0.745176136526056	104947
100	55621910	104948	0.840463848763197	0.9165586766779739	0.7286656248808934	104948
100	55726857	104947	0.856117849962362	0.9201215851810914	0.7454715237214975	104947
100	55726857	104948	0.8413023592636353	0.9175591721614513	0.7284559972557838	104948
100	55831804	104947	0.8560606782471152	0.9206933023335588	0.7447378200424977	104947
100	55831804	104948	0.8407211190303769	0.9171208598544041	0.7285226969546823	104948
100	55936751	104947	0.8557557624324659	0.9202931003268316	0.7452047223836794	104947
100	55936751	104948	0.8405781911041659	0.9167206616610131	0.7285226969546823	104948
100	56041698	104947	0.8558986917205827	0.9200262989890134	0.7452809513373417	104947
100	56041698	104948	0.8407020619735488	0.9166444334337005	0.7287227960513778	104948
100	56146645	104947	0.8557462338132581	0.9206647164759355	0.7443280894165627	104947
100	56146645	104948	0.8401303502687045	0.9167397187178412	0.7278271143804551	104948
100	56251592	104947	0.8551649880415829	0.920321686184455	0.7439564732674588	104947
100	56251592	104948	0.8389488127453596	0.9163204634676221	0.7284655257841979	104948
100	56356539	104947	0.8565466378267125	0.921141147436325	0.7453762375294196	104947
100	56356539	104948	0.8399016655867668	0.9164824484506613	0.7279795708350802	104948
100	56461486	104947	0.8559082203397905	0.9199691272737668	0.7433275843997447	104947
100	56461486	104948	0.8402637496665015	0.9166920760757709	0.7281606128749476	104948
100	56566433	104947	0.856889668118193	0.9201215851810914	0.7445567762775496	104947
100	56566433	104948	0.8409593322407287	0.9168635895872241	0.7283321263864009	104948
100	56671380	104947	0.8563179509657256	0.9208457602408835	0.744995092761108	104947
100	56671380	104948	0.8414452871898463	0.9177021000876625	0.7288276098639326	104948
100	56776327	104947	0.8567943819261151	0.9201501710387148	0.7444424328470561	104947
100	56776327	104948	0.8418645424400656	0.9168731181156382	0.7283035408011587	104948
100	56881274	104947	0.855936806197414	0.9197880835088187	0.74481404899616	104947
100	56881274	104948	0.8402161070244312	0.916177535541411	0.7285989251819949	104948
100	56986221	104947	0.8566038095419592	0.9204741440917796	0.7432227695884589	104947
100	56986221	104948	0.8407973472576895	0.9165682052063879	0.7283035408011587	104948
100	57091168	104947	0.8549839442766348	0.920416972376533	0.7436420288336018	104947
100	57091168	104948	0.8402446926096734	0.9166634904905286	0.7279986278919084	104948
100	57196115	104947	0.8548981867037647	0.9197499690319876	0.7439088301714198	104947
100	57196115	104948	0.8396443953195868	0.9161489499561688	0.7282368411022602	104948
100	57301062	104947	0.8561845502968165	0.9204074437573251	0.7446044193735886	104947
100	57301062	104948	0.8410165034112131	0.9163871631665206	0.7294850783245036	104948
100	57406009	104947	0.8556414190019724	0.9201311138002992	0.7448712207114067	104947
100	57406009	104948	0.840063650569806	0.917254259252201	0.7290944086595266	104948
100	57510956	104947	0.8563179509657256	0.9207123595719744	0.744537719039134	104947
100	57510956	104948	0.841511986888745	0.9174353012920684	0.7293802645119488	104948
100	57615903	104947	0.857070711883141	0.9207314168103901	0.7454715237214975	104947
100	57615903	104948	0.8407687616724473	0.917387658649998	0.7288180813355185	104948
100	57720850	104947	0.8556318903827647	0.9200644134658447	0.744766405900121	104947
100	57720850	104948	0.8405019628768533	0.9168731181156382	0.7284369401989557	104948
100	57825797	104947	0.8564894661114658	0.9209505750521692	0.7434705136878615	104947
100	57825797	104948	0.8400445935129779	0.9179117277127721	0.7277604146815566	104948
100	57930744	104947	0.8559653920550373	0.9207885885256367	0.7448616920921989	104947
100	57930744	104948	0.841130845752182	0.9166253763768724	0.7281891984601898	104948
100	58035691	104947	0.8565942809227515	0.9206837737143511	0.7425557662439136	104947
100	58035691	104948	0.8400922361550482	0.9170636886839196	0.7279795708350802	104948
100	58140638	104947	0.8567943819261151	0.9204550868533641	0.7451189648108093	104947
100	58140638	104948	0.8410546175248694	0.9168159469451538	0.7288943095628311	104948
100	58245585	104947	0.8567657960684917	0.9202549858500004	0.7444043183702249	104947
100	58245585	104948	0.8404162061211267	0.9171589739680603	0.7298185768189961	104948
100	58350532	104947	0.8568610822605697	0.9203121575652472	0.744356675274186	104947
100	58350532	104948	0.8412737736783931	0.9167492472462553	0.7286846819377215	104948
100	58455479	104947	0.8557843482900893	0.9210458612442471	0.7448712207114067	104947
100	58455479	104948	0.8410165034112131	0.9175877577466937	0.729018180432214	104948
100	58560426	104947	0.8570421260255177	0.9206170733798965	0.7444614900854717	104947
100	58560426	104948	0.8413118877920494	0.9170446316270915	0.7285036398978542	104948
100	58665373	104947	0.8565561664459204	0.9207314168103901	0.7441470456516146	104947
100	58665373	104948	0.839920722643595	0.9167778328314975	0.7277890002667988	104948
100	58770320	104947	0.8574232707938293	0.9195308107902084	0.7441375170324068	104947
100	58770320	104948	0.8405781911041659	0.9157868658764341	0.7294946068529177	104948
100	58875267	104947	0.8572041125520501	0.9202454572307927	0.7442995035589393	104947
100	58875267	104948	0.8410165034112131	0.9169017037008804	0.7281891984601898	104948
100	58980214	104947	0.8574709138898682	0.9203312148036628	0.7439469446482511	104947
100	58980214	104948	0.8409021610702443	0.9169207607577086	0.7285417540115104	104948
100	59085161	104947	0.8568801394989852	0.9214555918701821	0.7436229715951861	104947
100	59085161	104948	0.8407592331440332	0.9171780310248885	0.7289228951480733	104948
100	59190108	104947	0.8560987927239464	0.919730911793572	0.7443280894165627	104947
100	59190108	104948	0.8410165034112131	0.9170446316270915	0.7279128711361817	104948
100	59295055	104947	0.8577186579892708	0.9203407434228706	0.7448426348537833	104947
100	59295055	104948	0.842055113008347	0.9164919769790754	0.7289038380912451	104948
100	59400002	104947	0.8558605772437516	0.9203407434228706	0.7449569782842769	104947
100	59400002	104948	0.8409593322407287	0.9167397187178412	0.7294564927392614	104948
100	59504949	104947	0.8568134391645307	0.9203502720420784	0.7452428368605105	104947
100	59504949	104948	0.841397644547776	0.9170351030986774	0.7298185768189961	104948
100	59609896	104947	0.8562607792504788	0.9196070397438707	0.744766405900121	104947
100	59609896	104948	0.8402446926096734	0.9165110340359035	0.7303998170522544	104948
100	59714843	104947	0.8568324964029462	0.9199310127969356	0.7447759345193288	104947
100	59714843	104948	0.8404447917063689	0.9169588748713648	0.728884781034417	104948
100	59819790	104947	0.8575566714627383	0.9200834707042602	0.7449569782842769	104947
100	59819790	104948	0.8411403742805961	0.9170255745702633	0.7294183786256051	104948
100	59924737	104947	0.856479937492258	0.9203026289460394	0.744585362135173	104947
100	59924737	104948	0.8396920379616573	0.9163585775812784	0.7290658230742844	104948
100	60029684	104947	0.8569563684526476	0.9200644134658447	0.7440517594595367	104947
100	60029684	104948	0.8405400769905096	0.9165015055074894	0.7282749552159165	104948
100	60134631	104947	0.8552316883760375	0.919321181167637	0.7443376180357705	104947
100	60134631	104948	0.8397015664900713	0.9158154514616763	0.7286275107672371	104948
100	60239578	104947	0.85711835497918	0.9210268040058315	0.7449093351882379	104947
100	60239578	104948	0.8404066775927126	0.9171399169112322	0.7300853756145901	104948
100	60344525	104947	0.855841520005336	0.919959598654559	0.7442899749397315	104947
100	60344525	104948	0.8411117886953539	0.9161489499561688	0.7288180813355185	104948
100	60449472	104947	0.8568229677837385	0.9193021239292214	0.7450617930955625	104947
100	60449472	104948	0.8408068757861036	0.9161013073140984	0.7292563936425659	104948
100	60554419	104947	0.8561845502968165	0.9200739420850524	0.7446520624696276	104947
100	60554419	104948	0.8411784883942524	0.9166444334337005	0.7293707359835347	104948
100	60659366	104947	0.856889668118193	0.9198357266048577	0.7455191668175365	104947
100	60659366	104948	0.8405877196325799	0.9165110340359035	0.7301330182566604	104948
100	60764313	104947	0.8573756276977903	0.9190448512106111	0.7442518604629004	104947
100	60764313	104948	0.8412166025079086	0.915910736745817	0.7279223996645958	104948
100	60869260	104947	0.8557748196708815	0.9192163663563513	0.7428416248201473	104947
100	60869260	104948	0.839796851774212	0.9162823493539658	0.7295613065518162	104948
100	60974207	104947	0.8563274795849334	0.9193592956444682	0.7432227695884589	104947
100	60974207	104948	0.8411689598658383	0.9150436406601364	0.7282463696306742	104948
100	61079154	104947	0.8573375132209592	0.9209886895290004	0.7449760355226924	104947
100	61079154	104948	0.8420836985935892	0.9168731181156382	0.7294660212676755	104948
100	61184101	104947	0.8568039105453229	0.9199310127969356	0.7441089311747835	104947
100	61184101	104948	0.8406163052178222	0.9163014064107939	0.728484582841026	104948
100	61289048	104947	0.8555366041906868	0.920092999323468	0.7441946887476536	104947
100	61289048	104948	0.8401017646834623	0.9167778328314975	0.7289895948469718	104948
100	61393995	104947	0.85711835497918	0.9207314168103901	0.744766405900121	104947
100	61393995	104948	0.8409116895986584	0.9168064184167397	0.7294469642108473	104948
100	61498942	104947	0.8567848533069073	0.9193497670252604	0.7431560692540043	104947
100	61498942	104948	0.8411689598658383	0.9164157487517628	0.7283702405000572	104948
100	61603889	104947	0.8560035065318685	0.9201025279426758	0.7442899749397315	104947
100	61603889	104948	0.8415024583603309	0.9169684033997789	0.729018180432214	104948
100	61708836	104947	0.8552126311376218	0.9202740430884161	0.7441661028900302	104947
100	61708836	104948	0.8396062812059305	0.9169588748713648	0.729151579830011	104948
100	61813783	104947	0.8555461328098946	0.9195498680286239	0.7433752274957836	104947
100	61813783	104948	0.8408164043145177	0.9153676106262149	0.7292087510004955	104948
100	61918730	104947	0.8559939779126606	0.9193497670252604	0.7444805473238872	104947
100	61918730	104948	0.840063650569806	0.9163681061096924	0.7281510843465335	104948
100	62023677	104947	0.8572231697904656	0.920369329280494	0.7434514564494459	104947
100	62023677	104948	0.8412833022068071	0.9162061211266532	0.7289133666196592	104948
100	62128624	104947	0.8568610822605697	0.9201311138002992	0.7443757325126016	104947
100	62128624	104948	0.841521515417159	0.9171589739680603	0.7310191713991692	104948
100	62233571	104947	0.8567848533069073	0.9198547838432732	0.7455477526751598	104947
100	62233571	104948	0.8410641460532835	0.9160917787856844	0.7287990242786904	104948
100	62338518	104947	0.855708119336427	0.9192068377371435	0.7442899749397315	104947
100	62338518	104948	0.8409021610702443	0.9157392232343636	0.7283702405000572	104948
100	62443465	104947	0.8568706108797775	0.9197404404127798	0.7441565742708224	104947
100	62443465	104948	0.8407020619735488	0.9163871631665206	0.7282273125738461	104948
100	62548412	104947	0.8572041125520501	0.9200072417505979	0.7435467426415238	104947
100	62548412	104948	0.8407592331440332	0.9165300910927316	0.7285893966535808	104948
100	62653359	104947	0.8560987927239464	0.9202168713731693	0.7427463386280694	104947
100	62653359	104948	0.8396539238480009	0.9163395205244502	0.7287132675229637	104948
100	62758306	104947	0.8572136411712579	0.9196165683630785	0.744537719039134	104947
100	62758306	104948	0.8411689598658383	0.9157392232343636	0.7285131684262682	104948
100	62863253	104947	0.8567181529724528	0.9197118545551564	0.7446520624696276	104947
100	62863253	104948	0.8403685634790563	0.9162156496550673	0.7286465678240652	104948
100	62968200	104947	0.8550411159918816	0.9197404404127798	0.7444043183702249	104947
100	62968200	104948	0.8399874223424935	0.9159679079163014	0.7288276098639326	104948
100	63073147	104947	0.8569944829294787	0.9199024269393122	0.7435943857375628	104947
100	63073147	104948	0.8413881160193619	0.9154152532682852	0.7275126729427908	104948
100	63178094	104947	0.8574137421746215	0.9196165683630785	0.7438516584561731	104947
100	63178094	104948	0.8417025574570264	0.9157011091207073	0.7281320272897054	104948
100	63283041	104947	0.8557271765748425	0.9191496660218967	0.7431084261579655	104947
100	63283041	104948	0.8397682661889698	0.9156058238365666	0.7285322254830964	104948
100	63387988	104947	0.8566990957340371	0.9187589926343773	0.7430226685850954	104947
100	63387988	104948	0.8410165034112131	0.9154628959103556	0.7282463696306742	104948
100	63492935	104947	0.856251250631271	0.9196165683630785	0.7446711197080431	104947
100	63492935	104948	0.8393871250524069	0.9156058238365666	0.7284178831421275	104948
100	63597882	104947	0.8567562674492839	0.919730911793572	0.7442423318436925	104947
100	63597882	104948	0.8404447917063689	0.9155867667797385	0.728484582841026	104948
100	63702829	104947	0.8548981867037647	0.919454581836546	0.7436801433104329	104947
100	63702829	104948	0.8400922361550482	0.9163871631665206	0.729275450699394	104948
100	63807776	104947	0.8552698028528686	0.9202073427539615	0.7450999075723936	104947
100	63807776	104948	0.8399016655867668	0.9164633913938331	0.7291992224720815	104948
100	63912723	104947	0.8561369072007775	0.9198166693664421	0.744356675274186	104947
100	63912723	104948	0.8410165034112131	0.9159393223310592	0.7272077600335404	104948
100	64017670	104947	0.8566419240187905	0.9193307097868448	0.7443852611318094	104947
100	64017670	104948	0.841130845752182	0.9157201661775355	0.7274269161870641	104948
100	64122617	104947	0.8555080183330633	0.9191687232603124	0.7445663048967575	104947
100	64122617	104948	0.840063650569806	0.9163490490528643	0.7276270152837596	104948
100	64227564	104947	0.8574232707938293	0.9199691272737668	0.744309032178147	104947
100	64227564	104948	0.8410260319396272	0.9165396196211457	0.727941456721424	104948
100	64332511	104947	0.8565275805882969	0.9189876794953643	0.7432132409692511	104947
100	64332511	104948	0.8395681670922742	0.9153771391546289	0.7277890002667988	104948
100	64437458	104947	0.8552793314720764	0.9200644134658447	0.7440994025555757	104947
100	64437458	104948	0.8410165034112131	0.9173590730647558	0.7278747570225255	104948
100	64542405	104947	0.8566419240187905	0.9196546828399097	0.7450903789531859	104947
100	64542405	104948	0.8405877196325799	0.9165491481495598	0.7298566909326524	104948
100	64647352	104947	0.855708119336427	0.920092999323468	0.7449379210458612	104947
100	64647352	104948	0.8404257346495407	0.9169112322292945	0.72875138163662	104948
100	64752299	104947	0.8559653920550373	0.9200167703698057	0.7437087291680563	104947
100	64752299	104948	0.8411594313374242	0.9164824484506613	0.7276556008690018	104948
100	64857246	104947	0.8561369072007775	0.9195593966478318	0.7444043183702249	104947
100	64857246	104948	0.8410450889964554	0.9170255745702633	0.7282654266875024	104948
100	64962193	104947	0.856660981257206	0.9193497670252604	0.7440136449827055	104947
100	64962193	104948	0.8412928307352212	0.9155486526660822	0.7289038380912451	104948
100	65067140	104947	0.8564894661114658	0.919140137402689	0.7450236786187313	104947
100	65067140	104948	0.8402542211380875	0.9160346076152	0.7280176849487365	104948
100	65172087	104947	0.8555175469522711	0.9193402384060526	0.7448616920921989	104947
100	65172087	104948	0.8404257346495407	0.9157963944048482	0.7292087510004955	104948
100	65277034	104947	0.856613338161167	0.9194069387405072	0.7447282914232899	104947
100	65277034	104948	0.8411213172237679	0.9163681061096924	0.7279986278919084	104948
100	65381981	104947	0.8560511496279074	0.9204741440917796	0.7443757325126016	104947
100	65381981	104948	0.8404829058200252	0.917387658649998	0.7292373365857377	104948
100	65486928	104947	0.8555080183330633	0.9199405414161433	0.7428892679161863	104947
100	65486928	104948	0.8397015664900713	0.9152246827000038	0.728074856119221	104948
100	65591875	104947	0.8552983887104919	0.9199977131313901	0.7435657998799394	104947
100	65591875	104948	0.8383008728132028	0.9167301901894271	0.7285608110683386	104948
100	65696822	104947	0.8560416210086996	0.9207123595719744	0.7444805473238872	104947
100	65696822	104948	0.8406544193314784	0.9173019018942714	0.7292563936425659	104948
100	65801769	104947	0.8563465368233489	0.9192544808331825	0.7441184597939913	104947
100	65801769	104948	0.8398826085299387	0.9163299919960362	0.7287037389945497	104948
100	65906716	104947	0.855660476240388	0.9188638074456631	0.7434133419726148	104947
100	65906716	104948	0.8403113923085719	0.9165205625643176	0.7282844837443305	104948
100	66011663	104947	0.8551649880415829	0.9199214841777278	0.7420793352835241	104947
100	66011663	104948	0.8381198307733354	0.916844532530396	0.7269219041811182	104948
100	66116610	104947	0.8563274795849334	0.9206647164759355	0.74399458774429	104947
100	66116610	104948	0.8410736745816976	0.9167873613599116	0.7293326218698785	104948
100	66221557	104947	0.8559272775782062	0.9212364336284029	0.7432513554460823	104947
100	66221557	104948	0.8401112932118764	0.9171494454396463	0.7295041353813317	104948
100	66326504	104947	0.8552316883760375	0.9192068377371435	0.7432037123500433	104947
100	66326504	104948	0.8397015664900713	0.9162347067118954	0.7267694477264931	104948
100	66431451	104947	0.856432294396219	0.9197118545551564	0.7437373150256796	104947
100	66431451	104948	0.8405496055189237	0.9160155505583718	0.7270362465220871	104948
100	66536398	104947	0.8570897691215565	0.9204646154725719	0.7434609850686537	104947
100	66536398	104948	0.8409212181270724	0.9163204634676221	0.728084384647635	104948
100	66641345	104947	0.8558701058629594	0.9200453562274291	0.7428892679161863	104947
100	66641345	104948	0.8408735754850021	0.9161489499561688	0.7281034417044632	104948
100	66746292	104947	0.8556700048595958	0.9201215851810914	0.7442613890821081	104947
100	66746292	104948	0.8398063803026261	0.9164824484506613	0.7276270152837596	104948
100	66851239	104947	0.8568801394989852	0.9200739420850524	0.7449188638074457	104947
100	66851239	104948	0.8399016655867668	0.9166063193200442	0.7294279071540192	104948
100	66956186	104947	0.8564608802538424	0.9198166693664421	0.7422889649060954	104947
100	66956186	104948	0.8405686625757518	0.9174257727636543	0.7284655257841979	104948
100	67061133	104947	0.8572517556480891	0.920912460575338	0.7438802443137965	104947
100	67061133	104948	0.8408926325418302	0.9169017037008804	0.7291896939436674	104948
100	67166080	104947	0.8562417220120633	0.9207028309527666	0.7437944867409264	104947
100	67166080	104948	0.8393108968250943	0.9169302892861226	0.7295994206654725	104948
100	67271027	104947	0.8568706108797775	0.9202740430884161	0.7446044193735886	104947
100	67271027	104948	0.8397396806037276	0.9176544574455922	0.7279795708350802	104948
100	67375974	104947	0.8566800384956216	0.9190448512106111	0.7434133419726148	104947
100	67375974	104948	0.8396825094332432	0.9161489499561688	0.7289514807333155	104948
100	67480921	104947	0.855431789379401	0.9210077467674159	0.7425081231478746	104947
100	67480921	104948	0.8392156115409536	0.9171399169112322	0.7269314327095323	104948
100	67585868	104947	0.8558034055285049	0.9209410464329614	0.7437849581217186	104947
100	67585868	104948	0.8397777947173839	0.9171875595533026	0.728084384647635	104948
100	67690815	104947	0.8559272775782062	0.9205217871878186	0.7438802443137965	104947
100	67690815	104948	0.8399683652856653	0.9173019018942714	0.72875138163662	104948
100	67795762	104947	0.8563941799193879	0.9201978141347538	0.7434514564494459	104947
100	67795762	104948	0.8401112932118764	0.917378130121584	0.7283892975568853	104948
100	67900709	104947	0.8571469408368033	0.9203598006612862	0.7438707156945887	104947
100	67900709	104948	0.8406830049167207	0.9168064184167397	0.7284655257841979	104948
100	68005656	104947	0.8565275805882969	0.9209505750521692	0.7432418268268746	104947
100	68005656	104948	0.8400255364561497	0.9171589739680603	0.7270838891641574	104948
100	68110603	104947	0.8560987927239464	0.9202740430884161	0.7438230725985497	104947
100	68110603	104948	0.8394538247513054	0.9160917787856844	0.7288466669207607	104948
100	68215550	104947	0.8561369072007775	0.9209315178137536	0.7426510524359915	104947
100	68215550	104948	0.8398540229446965	0.9172066166101307	0.7285131684262682	104948
100	68320497	104947	0.856117849962362	0.9212650194860262	0.7440994025555757	104947
100	68320497	104948	0.8404162061211267	0.9170351030986774	0.7291706368868393	104948
100	68425444	104947	0.8556795334788035	0.921731921827208	0.7430036113466798	104947
100	68425444	104948	0.8397492091321417	0.9178831421275299	0.7291134657163547	104948
100	68530391	104947	0.856117849962362	0.9210268040058315	0.7435753284991472	104947
100	68530391	104948	0.8402732781949156	0.9172733163090292	0.7295517780234021	104948
100	68635338	104947	0.8565942809227515	0.9211316188171172	0.743537214022316	104947
100	68635338	104948	0.8414357586614323	0.9175305865762091	0.729666120364371	104948
100	68740285	104947	0.8553841462833621	0.9206170733798965	0.7422603790484721	104947
100	68740285	104948	0.8395205244502039	0.9171303883828181	0.7277318290963144	104948
100	68845232	104947	0.8549839442766348	0.9219891945458184	0.7444996045623029	104947
100	68845232	104948	0.8398444944162824	0.9172352021953729	0.7296375347791287	104948
100	68950179	104947	0.8557176479556348	0.9214841777278054	0.7428035103433162	104947
100	68950179	104948	0.8399493082288372	0.9177021000876625	0.7281129702328772	104948
100	69055126	104947	0.8570992977407644	0.9212173763899874	0.7432418268268746	104947
100	69055126	104948	0.8407020619735488	0.9174734154057247	0.7280367420055647	104948
100	69160073	104947	0.8558224627669204	0.9209791609097926	0.7427463386280694	104947
100	69160073	104948	0.840730647558791	0.9175305865762091	0.7272649312040248	104948
100	69265020	104947	0.856432294396219	0.9213126625820652	0.7437468436448874	104947
100	69265020	104948	0.8403113923085719	0.9175020009909669	0.7283035408011587	104948
100	69369967	104947	0.8562321933928554	0.9202168713731693	0.7439183587906276	104947
100	69369967	104948	0.8400255364561497	0.9173114304226855	0.7295613065518162	104948
100	69474914	104947	0.8561750216776087	0.9214841777278054	0.7430036113466798	104947
100	69474914	104948	0.8399111941151809	0.9175877577466937	0.7278366429088692	104948
100	69579861	104947	0.8557557624324659	0.9211602046747406	0.7432989985421212	104947
100	69579861	104948	0.8400827076266342	0.9176068148035218	0.727941456721424	104948
100	69684808	104947	0.8562036075352321	0.9213507770588963	0.7438040153601342	104947
100	69684808	104948	0.8391489118420551	0.9173495445363418	0.7294946068529177	104948
100	69789755	104947	0.8566419240187905	0.9214270060125587	0.7427272813896538	104947
100	69789755	104948	0.8402828067233297	0.9176830430308344	0.72767465792583	104948
100	69894702	104947	0.8558605772437516	0.9211792619131561	0.7430607830619265	104947
100	69894702	104948	0.8400160079277357	0.9176163433319359	0.7279033426077677	104948
100	69999649	104947	0.8565371092075047	0.9219891945458184	0.7437468436448874	104947
100	69999649	104948	0.839672980904829	0.9174734154057247	0.7282273125738461	104948
100	70104596	104947	0.8563370082041412	0.9211602046747406	0.7437944867409264	104947
100	70104596	104948	0.8413404733772916	0.917635400388764	0.7284464687273697	104948
100	70209543	104947	0.8565180519690891	0.9204360296149485	0.7433656988765758	104947
100	70209543	104948	0.8401303502687045	0.9169398178145367	0.7287132675229637	104948
100	70314490	104947	0.8568515536413618	0.9207600026680134	0.7439564732674588	104947
100	70314490	104948	0.8407211190303769	0.9172637877806151	0.7299138621031368	104948
100	70419437	104947	0.8564989947306736	0.9213507770588963	0.7441184597939913	104947
100	70419437	104948	0.8409783892975569	0.9180165415253269	0.7287227960513778	104948
100	70524384	104947	0.8554889610946478	0.9208743460985068	0.7446806483272509	104947
100	70524384	104948	0.8401017646834623	0.9174638868773106	0.7288943095628311	104948
100	70629331	104947	0.8555937759059334	0.9214174773933509	0.7445472476583418	104947
100	70629331	104948	0.8402161070244312	0.9174924724625528	0.7296851774211991	104948
100	70734278	104947	0.8560892641047386	0.9213126625820652	0.7435276854031082	104947
100	70734278	104948	0.840463848763197	0.9173495445363418	0.7291896939436674	104948
100	70839225	104947	0.8559844492934529	0.9202835717076239	0.7441375170324068	104947
100	70839225	104948	0.8405210199336814	0.9168921751724664	0.7300091473872775	104948
100	70944172	104947	0.8560892641047386	0.9220463662610651	0.7439564732674588	104947
100	70944172	104948	0.8402923352517437	0.9178354994854595	0.7291420513015969	104948
100	71049119	104947	0.8575566714627383	0.9222274100260132	0.7439755305058744	104947
100	71049119	104948	0.8402256355528452	0.9184072111903038	0.7285036398978542	104948
100	71154066	104947	0.8556414190019724	0.9209029319561303	0.7441661028900302	104947
100	71154066	104948	0.8400255364561497	0.9170351030986774	0.7298948050463087	104948
100	71259013	104947	0.8562703078696866	0.9218653224961171	0.7445186618007185	104947
100	71259013	104948	0.8406258337462362	0.9180641841673972	0.7284655257841979	104948
100	71363960	104947	0.8564704088730503	0.9217128645887924	0.7453285944333806	104947
100	71363960	104948	0.8414643442466745	0.9183500400198193	0.7302473605976293	104948
100	71468907	104947	0.856070206866323	0.9208838747177146	0.7432227695884589	104947
100	71468907	104948	0.8405210199336814	0.9176735145024203	0.7285512825399245	104948
100	71573854	104947	0.8566419240187905	0.9215413494430522	0.7433466416381602	104947
100	71573854	104948	0.8402065784960171	0.9174353012920684	0.7285226969546823	104948
100	71678801	104947	0.8567276815916606	0.9213603056781042	0.744356675274186	104947
100	71678801	104948	0.8410832031101116	0.9187597667416244	0.7294564927392614	104948
100	71783748	104947	0.8550601732302971	0.9223417534565066	0.7435181567839004	104947
100	71783748	104948	0.839920722643595	0.9181213553378816	0.7288752525060029	104948
100	71888695	104947	0.8557748196708815	0.9205884875222732	0.7408787292633424	104947
100	71888695	104948	0.839272782711438	0.9176830430308344	0.7277890002667988	104948
100	71993642	104947	0.8551078163263361	0.921960608688195	0.7440231736019134	104947
100	71993642	104948	0.8391870259557114	0.9183500400198193	0.7284274116705416	104948
100	72098589	104947	0.8566419240187905	0.9224465682677924	0.7438230725985497	104947
100	72098589	104948	0.841397644547776	0.9189884514235621	0.7286465678240652	104948
100	72203536	104947	0.8566514526379982	0.9217509790656236	0.7429750254890564	104947
100	72203536	104948	0.8411499028090101	0.9180165415253269	0.7277699432099707	104948
100	72308483	104947	0.8564037085385957	0.9225132686022468	0.7434323992110303	104947
100	72308483	104948	0.8394728818081335	0.918712124099554	0.7281891984601898	104948
100	72413430	104947	0.856432294396219	0.921960608688195	0.7437087291680563	104947
100	72413430	104948	0.8391584403704692	0.9176925715592484	0.7293612074551207	104948
100	72518377	104947	0.8572041125520501	0.9216271070159223	0.7432513554460823	104947
100	72518377	104948	0.8407687616724473	0.9172828448374433	0.7280081564203225	104948
100	72623324	104947	0.856432294396219	0.9213507770588963	0.7438421298369653	104947
100	72623324	104948	0.8400827076266342	0.9179403132980143	0.7288180813355185	104948
100	72728271	104947	0.8564894661114658	0.9220368376418573	0.7432513554460823	104947
100	72728271	104948	0.840320920836986	0.9182547547356786	0.7287894957502763	104948
100	72833218	104947	0.8572136411712579	0.9214079487741431	0.7443376180357705	104947
100	72833218	104948	0.8404733772916111	0.9179403132980143	0.7288180813355185	104948
100	72938165	104947	0.857347041840167	0.9213984201549353	0.743356170257368	104947
100	72938165	104948	0.8409307466554865	0.9183214544345771	0.7284464687273697	104948
100	73043112	104947	0.8560320923894918	0.9221607096915586	0.7435086281646927	104947
100	73043112	104948	0.8397777947173839	0.9188645805541792	0.7280939131760491	104948
100	73148059	104947	0.8573375132209592	0.9214746491085977	0.7437849581217186	104947
100	73148059	104948	0.8407497046156192	0.9175972862751077	0.7280176849487365	104948
100	73253006	104947	0.856070206866323	0.9219034369729483	0.7436134429759783	104947
100	73253006	104948	0.8403971490642985	0.9193505355032968	0.7278556999656973	104948
100	73357953	104947	0.8577663010853097	0.9218939083537404	0.7444900759430951	104947
100	73357953	104948	0.8405591340473377	0.918578724701757	0.7288752525060029	104948
100	73462900	104947	0.8581665030920369	0.9211125615787016	0.7448616920921989	104947
100	73462900	104948	0.8415977436444716	0.9186454244006556	0.7296089491938865	104948
100	73567847	104947	0.8572612842672969	0.921960608688195	0.7442995035589393	104947
100	73567847	104948	0.8399493082288372	0.9178831421275299	0.7288657239775889	104948
100	73672794	104947	0.8576996007508552	0.9217605076848314	0.743537214022316	104947
100	73672794	104948	0.8406448908030644	0.9181499409231239	0.7288561954491748	104948
100	73777741	104947	0.8565942809227515	0.9215413494430522	0.7430036113466798	104947
100	73777741	104948	0.8401494073255327	0.9181880550367801	0.726759919198079	104948
100	73882688	104947	0.856708624353245	0.9216556928735457	0.7420888639027319	104947
100	73882688	104948	0.8390917406715707	0.9184739108892023	0.7280653275908069	104948
100	73987635	104947	0.8568324964029462	0.9226847837479871	0.7425557662439136	104947
100	73987635	104948	0.8405400769905096	0.918569196173343	0.7280272134771506	104948
100	74092582	104947	0.8577091293700629	0.9216271070159223	0.7451475506684326	104947
100	74092582	104948	0.8407687616724473	0.9177497427297329	0.7295994206654725	104948
100	74197529	104947	0.856613338161167	0.9216747501119613	0.7433656988765758	104947
100	74197529	104948	0.8406258337462362	0.9178354994854595	0.728341654914815	104948
100	74302476	104947	0.8575662000819462	0.9210649184826627	0.7447282914232899	104947
100	74302476	104948	0.840187521439189	0.9176068148035218	0.7281891984601898	104948
100	74407423	104947	0.8560320923894918	0.9215889925390912	0.7427463386280694	104947
100	74407423	104948	0.8396634523764149	0.9185977817585852	0.7285989251819949	104948
100	74512370	104947	0.8574042135554136	0.921503234966221	0.7448807493306145	104947
100	74512370	104948	0.8409021610702443	0.918578724701757	0.7291706368868393	104948
100	74617317	104947	0.8578997017542188	0.9214270060125587	0.7446997055656664	104947
100	74617317	104948	0.8418359568548234	0.9181118268094676	0.7288943095628311	104948
100	74722264	104947	0.8571469408368033	0.9225323258406625	0.7445472476583418	104947
100	74722264	104948	0.8409783892975569	0.9188741090825933	0.7289038380912451	104948
100	74827211	104947	0.8572041125520501	0.9215222922046367	0.7442423318436925	104947
100	74827211	104948	0.841397644547776	0.9184643823607882	0.7280939131760491	104948
100	74932158	104947	0.8569182539758163	0.9218462652577015	0.743813543979342	104947
100	74932158	104948	0.8408831040134161	0.9189026946678355	0.7287894957502763	104948
100	75037105	104947	0.8562036075352321	0.9213031339628575	0.7435562712607316	104947
100	75037105	104948	0.8393585394671647	0.9188645805541792	0.7288180813355185	104948
100	75142052	104947	0.8567943819261151	0.9221702383107664	0.74408034531716	104947
100	75142052	104948	0.839787323245798	0.9190646796508747	0.7291039371879406	104948
100	75246999	104947	0.8573756276977903	0.9218939083537404	0.743861187075381	104947
100	75246999	104948	0.8409879178259709	0.9182547547356786	0.7299138621031368	104948
100	75351946	104947	0.8568324964029462	0.9223131675988833	0.7434419278302381	104947
100	75351946	104948	0.8397777947173839	0.9182642832640927	0.7294564927392614	104948
100	75456893	104947	0.856660981257206	0.9212459622476107	0.7446139479927963	104947
100	75456893	104948	0.8396062812059305	0.918168997979952	0.7287894957502763	104948
100	75561840	104947	0.8561083213431542	0.9221797669299742	0.7443947897510171	104947
100	75561840	104948	0.8392632541830239	0.9187692952700385	0.7295898921370584	104948
100	75666787	104947	0.8562798364888944	0.9220368376418573	0.744766405900121	104947
100	75666787	104948	0.8403113923085719	0.9190170370088043	0.7298281053474102	104948
100	75771734	104947	0.8573279846017514	0.9212936053436497	0.7450713217147703	104947
100	75771734	104948	0.8405305484620955	0.9187311811563822	0.7292563936425659	104948
100	75876681	104947	0.8566038095419592	0.9229039419897662	0.7440041163634977	104947
100	75876681	104948	0.8390726836147425	0.920027061020696	0.7299996188588634	104948
100	75981628	104947	0.8579282876118421	0.9224179824101689	0.7439469446482511	104947
100	75981628	104948	0.840320920836986	0.9188074093836948	0.728617982238823	104948
100	76086575	104947	0.8572803415057124	0.9218939083537404	0.7446806483272509	104947
100	76086575	104948	0.8403495064222282	0.9184072111903038	0.7293802645119488	104948
100	76191522	104947	0.8557462338132581	0.9223417534565066	0.7444424328470561	104947
100	76191522	104948	0.8392156115409536	0.9187692952700385	0.7286084537104089	104948
100	76296469	104947	0.8552888600912841	0.9222369386452209	0.7451951937644716	104947
100	76296469	104948	0.8392060830125395	0.9184834394176163	0.7286656248808934	104948
100	76401416	104947	0.8554127321409855	0.9223893965525456	0.7450236786187313	104947
100	76401416	104948	0.8393490109387506	0.9184548538323741	0.7292945077562222	104948
100	76506363	104947	0.855936806197414	0.9217605076848314	0.7441946887476536	104947
100	76506363	104948	0.8398730800015246	0.9184548538323741	0.7272172885619544	104948
100	76611310	104947	0.8550506446110894	0.922055894880273	0.7448045203769522	104947
100	76611310	104948	0.8384247436825857	0.9188645805541792	0.7296470633075428	104948
100	76716257	104947	0.8556509476211802	0.9221797669299742	0.743813543979342	104947
100	76716257	104948	0.8398921370583528	0.9195125204863361	0.7286751534093074	104948
100	76821204	104947	0.8562417220120633	0.9218939083537404	0.7445472476583418	104947
100	76821204	104948	0.8399111941151809	0.9192743072759842	0.7285608110683386	104948
100	76926151	104947	0.856432294396219	0.9221892955491819	0.7444233756086406	104947
100	76926151	104948	0.8402065784960171	0.9191313793497732	0.7288657239775889	104948
100	77031098	104947	0.8558129341477126	0.9223893965525456	0.7439088301714198	104947
100	77031098	104948	0.838338986926859	0.9197030910546176	0.7281891984601898	104948
100	77136045	104947	0.8554889610946478	0.9223036389796755	0.743765900883303	104947
100	77136045	104948	0.8381865304722339	0.9191694934634295	0.7283607119716431	104948
100	77240992	104947	0.8554984897138556	0.9223036389796755	0.7444329042278484	104947
100	77240992	104948	0.8394157106376491	0.9187597667416244	0.7290562945458703	104948
100	77345939	104947	0.8556128331443491	0.9227610127016495	0.7437182577872641	104947
100	77345939	104948	0.8395681670922742	0.918569196173343	0.7278271143804551	104948
100	77450886	104947	0.8555651900483101	0.92237033931413	0.743403813353407	104947
100	77450886	104948	0.8400350649845638	0.9193410069748827	0.7282844837443305	104948
100	77555833	104947	0.8562703078696866	0.9221416524531431	0.7441565742708224	104947
100	77555833	104948	0.8396443953195868	0.9186930670427259	0.7297804627053398	104948
100	77660780	104947	0.8552316883760375	0.9217605076848314	0.7440708166979523	104947
100	77660780	104948	0.8398540229446965	0.9187978808552807	0.7276556008690018	104948
100	77765727	104947	0.855841520005336	0.9226085547943248	0.7445472476583418	104947
100	77765727	104948	0.8397492091321417	0.9194077066737814	0.7285608110683386	104948
100	77870674	104947	0.8553841462833621	0.9219796659266106	0.7443852611318094	104947
100	77870674	104948	0.8397587376605558	0.9191313793497732	0.7280367420055647	104948
100	77975621	104947	0.8562893651081022	0.9222941103604677	0.7443757325126016	104947
100	77975621	104948	0.8396539238480009	0.9192647787475702	0.728484582841026	104948
100	78080568	104947	0.8562321933928554	0.9226752551287792	0.7454715237214975	104947
100	78080568	104948	0.8408259328429317	0.9188645805541792	0.7284559972557838	104948
100	78185515	104947	0.8568134391645307	0.9223608106949222	0.7435181567839004	104947
100	78185515	104948	0.840320920836986	0.9194648778442658	0.728208255517018	104948
100	78290462	104947	0.8556318903827647	0.9222655245028443	0.7448045203769522	104947
100	78290462	104948	0.8392060830125395	0.9193410069748827	0.7285512825399245	104948
100	78395409	104947	0.8554508466178166	0.9221511810723508	0.7436039143567705	104947
100	78395409	104948	0.8397492091321417	0.9188741090825933	0.7279319281930099	104948
100	78500356	104947	0.8551268735647517	0.9212173763899874	0.7439374160290433	104947
100	78500356	104948	0.8398349658878683	0.9184929679460304	0.7281891984601898	104948
100	78605303	104947	0.8563560654425567	0.9228562988937273	0.7442042173668614	104947
100	78605303	104948	0.8393585394671647	0.9193410069748827	0.7287704386934482	104948
100	78710250	104947	0.8559939779126606	0.9230373426586753	0.7435086281646927	104947
100	78710250	104948	0.8392537256546099	0.9196173342988909	0.7278842855509395	104948
100	78815197	104947	0.8560987927239464	0.9223798679333378	0.7446044193735886	104947
100	78815197	104948	0.8396348667911727	0.9191313793497732	0.7289514807333155	104948
100	78920144	104947	0.8560606782471152	0.9225132686022468	0.7441470456516146	104947
100	78920144	104948	0.840054122041392	0.9195982772420628	0.7285322254830964	104948
100	79025091	104947	0.8561273785815697	0.9222369386452209	0.7437754295025107	104947
100	79025091	104948	0.8397587376605558	0.9192933643328124	0.7284559972557838	104948
100	79130038	104947	0.8562607792504788	0.9215604066814678	0.7437563722640952	104947
100	79130038	104948	0.8402065784960171	0.9183500400198193	0.7282558981590883	104948
100	79234985	104947	0.8554413179986088	0.9215318208238444	0.7429083251546018	104947
100	79234985	104948	0.8393490109387506	0.9186644814574837	0.7278461714372833	104948
100	79339932	104947	0.8560416210086996	0.9225990261751169	0.7447759345193288	104947
100	79339932	104948	0.8404829058200252	0.9189122231962495	0.728475054312612	104948
100	79444879	104947	0.855936806197414	0.9224465682677924	0.744127988413199	104947
100	79444879	104948	0.8402828067233297	0.9192838358043983	0.7293326218698785	104948
100	79549826	104947	0.8546504426043622	0.9217700363040392	0.7425748234823292	104947
100	79549826	104948	0.8399874223424935	0.918712124099554	0.7275603155848611	104948
100	79654773	104947	0.8558034055285049	0.9228467702745196	0.743861187075381	104947
100	79654773	104948	0.8391965544841254	0.9195887487136487	0.7278747570225255	104948
100	79759720	104947	0.8556985907172192	0.9221988241683897	0.7425748234823292	104947
100	79759720	104948	0.8401398787971186	0.9190837367077028	0.7271410603346419	104948
100	79864667	104947	0.8559844492934529	0.9217986221616625	0.7446520624696276	104947
100	79864667	104948	0.8400731790982201	0.9185596676449289	0.7278652284941114	104948
100	79969614	104947	0.8557748196708815	0.9229515850858052	0.743632500214394	104947
100	79969614	104948	0.8391203262568129	0.9195506345999924	0.7279033426077677	104948
100	80074561	104947	0.8564894661114658	0.9218653224961171	0.7433371130189524	104947
100	80074561	104948	0.8404257346495407	0.9185882532301711	0.7292373365857377	104948
100	80179508	104947	0.8559463348166217	0.9220273090226495	0.7439469446482511	104947
100	80179508	104948	0.8405400769905096	0.9195792201852346	0.729275450699394	104948
100	80284455	104947	0.8560035065318685	0.9221797669299742	0.7430417258235109	104947
100	80284455	104948	0.8390917406715707	0.9191599649350154	0.7285417540115104	104948
100	80389402	104947	0.8561369072007775	0.9222559958836365	0.7432704126844979	104947
100	80389402	104948	0.8398444944162824	0.9188931661394214	0.7283797690284712	104948
100	80494349	104947	0.8557748196708815	0.9220940093571041	0.744585362135173	104947
100	80494349	104948	0.839406182109235	0.918578724701757	0.7281606128749476	104948
100	80599296	104947	0.8565371092075047	0.9227038409864027	0.7443852611318094	104947
100	80599296	104948	0.8398159088310402	0.9190456225940465	0.7287227960513778	104948
100	80704243	104947	0.8564608802538424	0.9226276120327404	0.7434705136878615	104947
100	80704243	104948	0.840197049967603	0.9194648778442658	0.7288657239775889	104948
100	80809190	104947	0.8560035065318685	0.9226657265095715	0.7451094361916015	104947
100	80809190	104948	0.8402542211380875	0.9194077066737814	0.7294660212676755	104948
100	80914137	104947	0.8556033045251412	0.9227133696056105	0.7436420288336018	104947
100	80914137	104948	0.8386915424781797	0.9191504364066013	0.7282940122727446	104948
100	81019084	104947	0.8559558634358295	0.9228848847513507	0.7438421298369653	104947
100	81019084	104948	0.8397492091321417	0.9194362922590236	0.72741738765865	104948
100	81124031	104947	0.8559558634358295	0.9218081507808703	0.743308527161329	104947
100	81124031	104948	0.8409402751839006	0.9195982772420628	0.7278366429088692	104948
100	81228978	104947	0.8562798364888944	0.9228372416553118	0.7438516584561731	104947
100	81228978	104948	0.8396443953195868	0.9193314784464687	0.7291420513015969	104948
100	81333925	104947	0.8549553584190115	0.9228658275129351	0.7437563722640952	104947
100	81333925	104948	0.840454320234783	0.9186168388154133	0.7283702405000572	104948
100	81438872	104947	0.8552983887104919	0.9224275110293767	0.7430893689195499	104947
100	81438872	104948	0.8388439989328048	0.9187597667416244	0.7289895948469718	104948
100	81543819	104947	0.8557557624324659	0.9222845817412598	0.7435181567839004	104947
100	81543819	104948	0.8404257346495407	0.9187407096847963	0.7293897930403629	104948
100	81648766	104947	0.8557748196708815	0.9219034369729483	0.7447949917577444	104947
100	81648766	104948	0.8407592331440332	0.9184929679460304	0.7293040362846362	104948
100	81753713	104947	0.8555175469522711	0.9222559958836365	0.7440041163634977	104947
100	81753713	104948	0.8406163052178222	0.9185882532301711	0.7285608110683386	104948
100	81858660	104947	0.8552793314720764	0.9225799689367014	0.7444424328470561	104947
100	81858660	104948	0.8400255364561497	0.9195220490147502	0.7287323245797919	104948
100	81963607	104947	0.8560797354855308	0.9218939083537404	0.7436515574528095	104947
100	81963607	104948	0.8412833022068071	0.9188741090825933	0.7285893966535808	104948
100	82068554	104947	0.856117849962362	0.9222655245028443	0.7434133419726148	104947
100	82068554	104948	0.8399874223424935	0.9193886496169532	0.7289419522049014	104948
100	82173501	104947	0.8550506446110894	0.9229896995626363	0.7436039143567705	104947
100	82173501	104948	0.8389011701032892	0.9198174333955864	0.7282558981590883	104948
100	82278448	104947	0.8565275805882969	0.9223798679333378	0.7431179547771732	104947
100	82278448	104948	0.8408831040134161	0.9196935625262035	0.7282463696306742	104948
100	82383395	104947	0.8562226647736476	0.9232850867580779	0.7439374160290433	104947
100	82383395	104948	0.8398635514731105	0.9195887487136487	0.7281034417044632	104948
100	82488342	104947	0.8560130351510763	0.9228372416553118	0.7437944867409264	104947
100	82488342	104948	0.8405114914052674	0.9196173342988909	0.7283988260852994	104948
100	82593289	104947	0.8558796344821672	0.9220749521186885	0.7432894699229134	104947
100	82593289	104948	0.840197049967603	0.9179974844684987	0.7298852765178946	104948
100	82698236	104947	0.856613338161167	0.9223608106949222	0.7446139479927963	104947
100	82698236	104948	0.8406353622746503	0.9185501391165148	0.7305999161489499	104948
100	82803183	104947	0.8563274795849334	0.923142157469961	0.7437944867409264	104947
100	82803183	104948	0.8402446926096734	0.9194553493158517	0.7288657239775889	104948
100	82908130	104947	0.8566895671148294	0.9221511810723508	0.7427844531049006	104947
100	82908130	104948	0.8417311430422686	0.9186073102869993	0.728617982238823	104948
100	83013077	104947	0.8567467388300761	0.9229706423242208	0.7425081231478746	104947
100	83013077	104948	0.840997446354385	0.9195411060715784	0.7282368411022602	104948
100	83118024	104947	0.8571469408368033	0.9227514840824416	0.7434895709262771	104947
100	83118024	104948	0.8411117886953539	0.9191885505202576	0.7291896939436674	104948
100	83222971	104947	0.8555270755714789	0.9216271070159223	0.7436706146912251	104947
100	83222971	104948	0.8405114914052674	0.9184929679460304	0.7288180813355185	104948
100	83327918	104947	0.8563560654425567	0.9224465682677924	0.7448902779498223	104947
100	83327918	104948	0.8412928307352212	0.9191218508213591	0.7299519762167931	104948
100	83432865	104947	0.8562131361544398	0.9214460632509743	0.74399458774429	104947
100	83432865	104948	0.8399016655867668	0.9182356976788505	0.7298376338758242	104948
100	83537812	104947	0.8569658970718553	0.9229420564665974	0.7430703116811342	104947
100	83537812	104948	0.8403876205358844	0.9190170370088043	0.7290848801311125	104948
100	83642759	104947	0.8562893651081022	0.9222083527875975	0.7434419278302381	104947
100	83642759	104948	0.840063650569806	0.9194172352021954	0.7286084537104089	104948
100	83747706	104947	0.8566514526379982	0.9222178814068054	0.744127988413199	104947
100	83747706	104948	0.8403495064222282	0.9195887487136487	0.7284274116705416	104948
100	83852653	104947	0.8569277825950241	0.9218939083537404	0.7432989985421212	104947
100	83852653	104948	0.8403685634790563	0.919112322292945	0.7287990242786904	104948
100	83957600	104947	0.8565275805882969	0.9228372416553118	0.7446615910888353	104947
100	83957600	104948	0.8411022601669398	0.9199508327933834	0.7294183786256051	104948
100	84062547	104947	0.8556128331443491	0.9223989251717534	0.7442042173668614	104947
100	84062547	104948	0.8404066775927126	0.9189884514235621	0.72875138163662	104948
100	84167494	104947	0.8560130351510763	0.9225227972214547	0.7435276854031082	104947
100	84167494	104948	0.8400445935129779	0.9195792201852346	0.7291611083584251	104948
100	84272441	104947	0.8568229677837385	0.9221892955491819	0.7443185607973548	104947
100	84272441	104948	0.8410546175248694	0.9184548538323741	0.7299996188588634	104948
100	84377388	104947	0.8567657960684917	0.9223989251717534	0.742946439631433	104947
100	84377388	104948	0.8399588367572512	0.9189312802530777	0.7284178831421275	104948
100	84482335	104947	0.8566323953995827	0.9221702383107664	0.7439469446482511	104947
100	84482335	104948	0.8413118877920494	0.9185882532301711	0.7298471624042383	104948
100	84587282	104947	0.8561845502968165	0.9219796659266106	0.7444329042278484	104947
100	84587282	104948	0.8404829058200252	0.9189217517246636	0.7293135648130503	104948
100	84692229	104947	0.8560320923894918	0.9224846827446235	0.7439469446482511	104947
100	84692229	104948	0.8410832031101116	0.9189884514235621	0.7301425467850745	104948
100	84797176	104947	0.8560892641047386	0.9225704403174936	0.7422984935253032	104947
100	84797176	104948	0.8402256355528452	0.9186930670427259	0.7285036398978542	104948
100	84902123	104947	0.8557748196708815	0.9226180834135326	0.7434705136878615	104947
100	84902123	104948	0.8400445935129779	0.9193219499180546	0.7286942104661356	104948
100	85007070	104947	0.8554603752370245	0.9216461642543379	0.7423937797173812	104947
100	85007070	104948	0.8404829058200252	0.9182356976788505	0.729408850097191	104948
100	85112017	104947	0.8566038095419592	0.922008251784234	0.7444614900854717	104947
100	85112017	104948	0.840320920836986	0.9187692952700385	0.7303712314670122	104948
100	85216964	104947	0.855660476240388	0.9223512820757144	0.744127988413199	104947
100	85216964	104948	0.840320920836986	0.9184929679460304	0.729018180432214	104948
100	85321911	104947	0.8553269745681154	0.9221797669299742	0.7442137459860692	104947
100	85321911	104948	0.8403018637801578	0.9192552502191561	0.729275450699394	104948
100	85426858	104947	0.8553269745681154	0.9218176794000781	0.7432989985421212	104947
100	85426858	104948	0.839672980904829	0.918578724701757	0.7286846819377215	104948
100	85531805	104947	0.8555366041906868	0.9220368376418573	0.7444614900854717	104947
100	85531805	104948	0.8401779929107749	0.9188741090825933	0.7292182795289096	104948
100	85636752	104947	0.8567943819261151	0.9227038409864027	0.7438802443137965	104947
100	85636752	104948	0.8409116895986584	0.9188741090825933	0.7292278080573237	104948
100	85741699	104947	0.8560130351510763	0.9224275110293767	0.7439183587906276	104947
100	85741699	104948	0.8413214163204634	0.919245721690742	0.730209246483973	104948
100	85846646	104947	0.8558605772437516	0.9217986221616625	0.743765900883303	104947
100	85846646	104948	0.8405114914052674	0.9190360940656325	0.728617982238823	104948
100	85951593	104947	0.855841520005336	0.9222655245028443	0.7429559682506408	104947
100	85951593	104948	0.8396920379616573	0.918835994968937	0.7288085528071044	104948
100	86056540	104947	0.8555270755714789	0.9228086557976883	0.744127988413199	104947
100	86056540	104948	0.8405591340473377	0.9191504364066013	0.728475054312612	104948
100	86161487	104947	0.8556985907172192	0.922961113705013	0.7449093351882379	104947
100	86161487	104948	0.8404352631779548	0.9189884514235621	0.7300186759156916	104948
100	86266434	104947	0.8563941799193879	0.9220844807378963	0.7443280894165627	104947
100	86266434	104948	0.8397587376605558	0.9190360940656325	0.7283035408011587	104948
100	86371381	104947	0.8552793314720764	0.9217223932080002	0.7435562712607316	104947
100	86371381	104948	0.8403876205358844	0.9184453253039601	0.7289514807333155	104948
100	86476328	104947	0.8557557624324659	0.9218939083537404	0.7445472476583418	104947
100	86476328	104948	0.8411499028090101	0.9181594694515379	0.7289610092617296	104948
100	86581275	104947	0.856432294396219	0.9227895985592728	0.7444805473238872	104947
100	86581275	104948	0.8407401760872051	0.9194362922590236	0.7300949041430042	104948
100	86686222	104947	0.8550887590879206	0.9224179824101689	0.7436610860720173	104947
100	86686222	104948	0.8399778938140794	0.9190551511224606	0.7295613065518162	104948
100	86791169	104947	0.8560416210086996	0.9222559958836365	0.7452904799565495	104947
100	86791169	104948	0.8394157106376491	0.9189884514235621	0.7295136639097458	104948
100	86896116	104947	0.8565752236843359	0.9223989251717534	0.7447092341848742	104947
100	86896116	104948	0.8410450889964554	0.9184453253039601	0.7297423485916835	104948
100	87001063	104947	0.8553746176641542	0.9222845817412598	0.7430893689195499	104947
100	87001063	104948	0.840320920836986	0.918712124099554	0.7292849792278081	104948
100	87106010	104947	0.8556509476211802	0.9221035379763118	0.7428416248201473	104947
100	87106010	104948	0.8397015664900713	0.9187692952700385	0.7286465678240652	104948
100	87210957	104947	0.8556985907172192	0.9213317198204808	0.7443757325126016	104947
100	87210957	104948	0.8412356595647368	0.9183976826618897	0.7298662194610664	104948
100	87315904	104947	0.8559272775782062	0.9225609116982858	0.7446997055656664	104947
100	87315904	104948	0.8403590349506422	0.9188741090825933	0.7300186759156916	104948
100	87420851	104947	0.8553936749025699	0.9216271070159223	0.7430226685850954	104947
100	87420851	104948	0.8398444944162824	0.9178640850707017	0.727550787056447	104948
100	87525798	104947	0.8555937759059334	0.9210744471018705	0.7445472476583418	104947
100	87525798	104948	0.8401208217402905	0.9173019018942714	0.7296184777223006	104948
100	87630745	104947	0.8559082203397905	0.9215604066814678	0.7430131399658876	104947
100	87630745	104948	0.8398444944162824	0.9180832412242254	0.727941456721424	104948
100	87735692	104947	0.8562036075352321	0.9221321238339353	0.7447187628040821	104947
100	87735692	104948	0.8409212181270724	0.9185406105881008	0.7298852765178946	104948
100	87840639	104947	0.8560225637702841	0.9218462652577015	0.7438230725985497	104947
100	87840639	104948	0.8406925334451347	0.9182166406220224	0.7288657239775889	104948
100	87945586	104947	0.8562988937273099	0.9222941103604677	0.7436706146912251	104947
100	87945586	104948	0.840864046956588	0.918845523497351	0.7293040362846362	104948
100	88050533	104947	0.8566419240187905	0.9225323258406625	0.7442232746052769	104947
100	88050533	104948	0.841654914814956	0.9181022982810535	0.7297042344780272	104948
100	88155480	104947	0.8565085233498814	0.9219796659266106	0.7441946887476536	104947
100	88155480	104948	0.8409783892975569	0.9182833403209208	0.729809048290582	104948
100	88260427	104947	0.8564608802538424	0.9212650194860262	0.7452618940989261	104947
100	88260427	104948	0.8407497046156192	0.9184548538323741	0.7290753516026984	104948
100	88365374	104947	0.8558986917205827	0.9205694302838575	0.7448426348537833	104947
100	88365374	104948	0.8407020619735488	0.9170827457407478	0.7299615047452072	104948
100	88470321	104947	0.8542788264552583	0.9226561978903637	0.7441470456516146	104947
100	88470321	104948	0.8388249418759767	0.9173304874795136	0.7297137630064413	104948
100	88575268	104947	0.8557748196708815	0.9214555918701821	0.7443280894165627	104947
100	88575268	104948	0.8401779929107749	0.9178354994854595	0.7296375347791287	104948
100	88680215	104947	0.8560892641047386	0.9213888915357276	0.7450332072379392	104947
100	88680215	104948	0.841778785684339	0.9185215535312726	0.7299900903304494	104948
100	88785162	104947	0.855660476240388	0.9213031339628575	0.7442042173668614	104947
100	88785162	104948	0.8398063803026261	0.9175020009909669	0.7300091473872775	104948
100	88890109	104947	0.8553936749025699	0.9212936053436497	0.7421365069987708	104947
100	88890109	104948	0.8403590349506422	0.9177211571444907	0.7293612074551207	104948
100	88995056	104947	0.8554603752370245	0.922055894880273	0.7430226685850954	104947
100	88995056	104948	0.8399016655867668	0.9181880550367801	0.7291611083584251	104948
100	89100003	104947	0.856117849962362	0.9220368376418573	0.7448235776153678	104947
100	89100003	104948	0.8408259328429317	0.9186454244006556	0.7300377329725197	104948
100	89204950	104947	0.8553936749025699	0.9219891945458184	0.7437849581217186	104947
100	89204950	104948	0.8397777947173839	0.9171589739680603	0.7297423485916835	104948
100	89309897	104947	0.8556033045251412	0.9211602046747406	0.7439660018866666	104947
100	89309897	104948	0.8403876205358844	0.9169398178145367	0.7296851774211991	104948
100	89414844	104947	0.8553555604257387	0.9211792619131561	0.744585362135173	104947
100	89414844	104948	0.839406182109235	0.9178736135991158	0.7296565918359569	104948
100	89519791	104947	0.8566800384956216	0.9213888915357276	0.7451189648108093	104947
100	89519791	104948	0.8410641460532835	0.9179212562411861	0.7291611083584251	104948
100	89624738	104947	0.8548600722269336	0.9208076457640524	0.7443852611318094	104947
100	89624738	104948	0.8405877196325799	0.9170636886839196	0.7298185768189961	104948
100	89729685	104947	0.8551649880415829	0.9220177804034417	0.7443280894165627	104947
100	89729685	104948	0.8392251400693677	0.9176639859740062	0.7290944086595266	104948
100	89834632	104947	0.8565656950651281	0.9219129655921561	0.745404823387043	104947
100	89834632	104948	0.8407401760872051	0.9182547547356786	0.7302568891260434	104948
100	89939579	104947	0.8555937759059334	0.9217605076848314	0.7443280894165627	104947
100	89939579	104948	0.8410832031101116	0.9182356976788505	0.7299519762167931	104948
100	90044526	104947	0.8550887590879206	0.9216747501119613	0.743584857118355	104947
100	90044526	104948	0.8397396806037276	0.9178164424286314	0.7278271143804551	104948
100	90149473	104947	0.8550982877071284	0.922008251784234	0.7440517594595367	104947
100	90149473	104948	0.839930251172009	0.9181785265083661	0.7297804627053398	104948
100	90254420	104947	0.8566323953995827	0.9213412484396886	0.7449283924266534	104947
100	90254420	104948	0.8410927316385257	0.9186073102869993	0.7300472615009338	104948
100	90359367	104947	0.8557462338132581	0.9209982181482081	0.7433371130189524	104947
100	90359367	104948	0.8410165034112131	0.9180546556389831	0.7295898921370584	104948
100	90464314	104947	0.8554127321409855	0.9213603056781042	0.743537214022316	104947
100	90464314	104948	0.8394728818081335	0.9174734154057247	0.7283607119716431	104948
100	90569261	104947	0.8568801394989852	0.9209315178137536	0.7435753284991472	104947
100	90569261	104948	0.8408926325418302	0.9179974844684987	0.7290753516026984	104948
100	90674208	104947	0.8563560654425567	0.9220463662610651	0.7441184597939913	104947
100	90674208	104948	0.8401589358539467	0.9180927697526394	0.729532720966574	104948
100	90779155	104947	0.8559844492934529	0.9215413494430522	0.7431655978732122	104947
100	90779155	104948	0.8399778938140794	0.9177116286160766	0.7298376338758242	104948
100	90884102	104947	0.8554603752370245	0.922055894880273	0.7434323992110303	104947
100	90884102	104948	0.8400827076266342	0.9183786256050616	0.729408850097191	104948
100	90989049	104947	0.8555080183330633	0.9219129655921561	0.7443662038933938	104947
100	90989049	104948	0.8398540229446965	0.9181118268094676	0.7285322254830964	104948
100	91093996	104947	0.8561464358199853	0.9220368376418573	0.7444519614662639	104947
100	91093996	104948	0.8419026565537219	0.9188074093836948	0.7281891984601898	104948
100	91198943	104947	0.8566419240187905	0.9211316188171172	0.744175631509238	104947
100	91198943	104948	0.8407592331440332	0.917387658649998	0.7287799672218622	104948
100	91303890	104947	0.8551649880415829	0.9208743460985068	0.7436134429759783	104947
100	91303890	104948	0.8398635514731105	0.917635400388764	0.7278366429088692	104948
100	91408837	104947	0.8558224627669204	0.9199310127969356	0.743356170257368	104947
100	91408837	104948	0.840063650569806	0.9164824484506613	0.727550787056447	104948
100	91513784	104947	0.855431789379401	0.921779564923247	0.7450332072379392	104947
100	91513784	104948	0.8406639478598925	0.9183119259061631	0.7297804627053398	104948
100	91618731	104947	0.8568610822605697	0.9207600026680134	0.7441375170324068	104947
100	91618731	104948	0.8416930289286123	0.917387658649998	0.7289324236764874	104948
100	91723678	104947	0.8558319913861282	0.9214651204893899	0.7430036113466798	104947
100	91723678	104948	0.8403113923085719	0.917635400388764	0.729532720966574	104948
100	91828625	104947	0.8559939779126606	0.9213507770588963	0.7434990995454849	104947
100	91828625	104948	0.8411499028090101	0.9174162442352403	0.7288752525060029	104948
100	91933572	104947	0.855841520005336	0.9211602046747406	0.7437563722640952	104947
100	91933572	104948	0.8400160079277357	0.9179593703548424	0.7281987269886039	104948
100	92038519	104947	0.8553936749025699	0.9203979151381173	0.7432989985421212	104947
100	92038519	104948	0.8399969508709075	0.9167397187178412	0.7281987269886039	104948
100	92143466	104947	0.8552983887104919	0.921141147436325	0.7440422308403289	104947
100	92143466	104948	0.8393966535808209	0.9176925715592484	0.7286942104661356	104948
100	92248413	104947	0.8549458297998037	0.921779564923247	0.7428702106777707	104947
100	92248413	104948	0.8405496055189237	0.9180260700537409	0.7285512825399245	104948
100	92353360	104947	0.8559272775782062	0.9215604066814678	0.7443471466549783	104947
100	92353360	104948	0.8398540229446965	0.9186263673438274	0.7280653275908069	104948
100	92458307	104947	0.8563655940617645	0.9208552888600913	0.7453285944333806	104947
100	92458307	104948	0.8415501010024012	0.9171685024964744	0.7290086519037999	104948
100	92563254	104947	0.8562988937273099	0.9211792619131561	0.7430703116811342	104947
100	92563254	104948	0.8407497046156192	0.917902199184358	0.728208255517018	104948
100	92668201	104947	0.8559844492934529	0.9206456592375198	0.7442232746052769	104947
100	92668201	104948	0.8410736745816976	0.917768799786561	0.7283321263864009	104948
100	92773148	104947	0.855660476240388	0.9212364336284029	0.7434609850686537	104947
100	92773148	104948	0.840187521439189	0.9173114304226855	0.7279986278919084	104948
100	92878095	104947	0.8557557624324659	0.9211125615787016	0.7436801433104329	104947
100	92878095	104948	0.8405877196325799	0.9175687006898654	0.7281796699317757	104948
100	92983042	104947	0.8557652910516736	0.9199881845121823	0.7441946887476536	104947
100	92983042	104948	0.8408259328429317	0.9170065175134352	0.7283892975568853	104948
100	93087989	104947	0.8564989947306736	0.9203026289460394	0.744309032178147	104947
100	93087989	104948	0.8409688607691428	0.9165396196211457	0.7279700423066662	104948
100	93192936	104947	0.8559177489589983	0.9215699353006755	0.7437277864064719	104947
100	93192936	104948	0.8402637496665015	0.9186263673438274	0.7267789762549072	104948
100	93297883	104947	0.8556033045251412	0.9210077467674159	0.7447568772809132	104947
100	93297883	104948	0.8395586385638601	0.9172637877806151	0.7285893966535808	104948
100	93402830	104947	0.8550125301342583	0.9220177804034417	0.742584352101537	104947
100	93402830	104948	0.839539581507032	0.918435796775546	0.7277318290963144	104948
100	93507777	104947	0.8559082203397905	0.9204836727109874	0.7425271803862902	104947
100	93507777	104948	0.8402542211380875	0.9174448298204825	0.727150588863056	104948
100	93612724	104947	0.8557843482900893	0.920369329280494	0.7442899749397315	104947
100	93612724	104948	0.8413690589625338	0.9169207607577086	0.7280462705339787	104948
100	93717671	104947	0.8555175469522711	0.9214270060125587	0.7424985945286668	104947
100	93717671	104948	0.8414929298319168	0.9171875595533026	0.7271982315051263	104948
100	93822618	104947	0.8552221597568297	0.920416972376533	0.742222264571641	104947
100	93822618	104948	0.8392441971261958	0.9170351030986774	0.7270648321073293	104948
100	93927565	104947	0.8550125301342583	0.9212745481052341	0.742631995197576	104947
100	93927565	104948	0.8395491100354461	0.9175972862751077	0.7274936158859626	104948
100	94032512	104947	0.8559177489589983	0.9206170733798965	0.7438421298369653	104947
100	94032512	104948	0.8408926325418302	0.9169874604566071	0.7278652284941114	104948
100	94137459	104947	0.8560511496279074	0.9215794639198833	0.7430607830619265	104947
100	94137459	104948	0.8403113923085719	0.9178164424286314	0.7275126729427908	104948
100	94242406	104947	0.8549934728958427	0.9210363326250393	0.7445663048967575	104947
100	94242406	104948	0.8404829058200252	0.9172923733658573	0.727817585852041	104948
100	94347353	104947	0.855660476240388	0.9200358276082213	0.7446330052312119	104947
100	94347353	104948	0.840997446354385	0.9164157487517628	0.7275889011701033	104948
100	94452300	104947	0.8550982877071284	0.9208838747177146	0.7440327022211212	104947
100	94452300	104948	0.840597248160994	0.9172161451385448	0.7285322254830964	104948
100	94557247	104947	0.8567372102108683	0.9213888915357276	0.7430036113466798	104947
100	94557247	104948	0.8409498037123146	0.9170636886839196	0.7278747570225255	104948
100	94662194	104947	0.8552412169952452	0.9213888915357276	0.7445091331815107	104947
100	94662194	104948	0.8402351640812593	0.9174829439341388	0.7284083546137134	104948
100	94767141	104947	0.8555270755714789	0.9209505750521692	0.7429178537738096	104947
100	94767141	104948	0.8407782902008614	0.9173495445363418	0.7282463696306742	104948
100	94872088	104947	0.8559272775782062	0.9214841777278054	0.7450903789531859	104947
100	94872088	104948	0.840997446354385	0.9176068148035218	0.7285131684262682	104948
100	94977035	104947	0.8550506446110894	0.9211220901979095	0.7417839480880826	104947
100	94977035	104948	0.8409783892975569	0.9165110340359035	0.7272649312040248	104948
100	95081982	104947	0.8555747186675179	0.9203312148036628	0.7434419278302381	104947
100	95081982	104948	0.8404924343484392	0.9167778328314975	0.7279986278919084	104948
100	95186929	104947	0.8552126311376218	0.9207695312872212	0.7441661028900302	104947
100	95186929	104948	0.8410641460532835	0.916310934939208	0.729018180432214	104948
100	95291876	104947	0.8552888600912841	0.9212745481052341	0.7424890659094591	104947
100	95291876	104948	0.8395967526775164	0.9173114304226855	0.7269314327095323	104948
100	95396823	104947	0.8552698028528686	0.9214746491085977	0.7431370120155888	104947
100	95396823	104948	0.8400922361550482	0.9177116286160766	0.726893318595876	104948
100	95501770	104947	0.8541835402631804	0.9215699353006755	0.7432037123500433	104947
100	95501770	104948	0.8394347676944772	0.9175401151046232	0.7273506879597514	104948
100	95606717	104947	0.855250745614453	0.9211220901979095	0.7429083251546018	104947
100	95606717	104948	0.8402161070244312	0.9176163433319359	0.726750390669665	104948
100	95711664	104947	0.854755257415648	0.9213603056781042	0.742631995197576	104947
100	95711664	104948	0.840187521439189	0.9178069139002173	0.7282844837443305	104948
100	95816611	104947	0.8549839442766348	0.9202835717076239	0.7447092341848742	104947
100	95816611	104948	0.8387677707054922	0.9169684033997789	0.7288752525060029	104948
100	95921558	104947	0.8543074123128818	0.9208171743832602	0.7423080221445111	104947
100	95921558	104948	0.8394157106376491	0.9168064184167397	0.7272077600335404	104948
100	96026505	104947	0.8548219577501024	0.9209601036713769	0.7434514564494459	104947
100	96026505	104948	0.8393680679955787	0.9169112322292945	0.7274936158859626	104948
100	96131452	104947	0.854526570554661	0.9211506760555328	0.7428416248201473	104947
100	96131452	104948	0.8391012691999847	0.9171208598544041	0.7272744597324389	104948
100	96236399	104947	0.8548600722269336	0.9208267030024679	0.7433180557805369	104947
100	96236399	104948	0.8395967526775164	0.917521058047795	0.7269790753516027	104948
100	96341346	104947	0.8545075133162453	0.9207123595719744	0.7430417258235109	104947
100	96341346	104948	0.8388439989328048	0.9172733163090292	0.7272839882608529	104948
100	96446293	104947	0.8542692978360505	0.9207885885256367	0.7430988975387577	104947
100	96446293	104948	0.839796851774212	0.9168159469451538	0.7269790753516027	104948
100	96551240	104947	0.8542311833592194	0.9206361306183121	0.7430893689195499	104947
100	96551240	104948	0.8393680679955787	0.9173019018942714	0.7274269161870641	104948
100	96656187	104947	0.8548696008461414	0.921093504340286	0.7430036113466798	104947
100	96656187	104948	0.8410165034112131	0.9170160460418493	0.7284655257841979	104948
100	96761134	104947	0.8538690958293234	0.9207314168103901	0.7429178537738096	104947
100	96761134	104948	0.8394728818081335	0.9168159469451538	0.7274650303007204	104948
100	96866081	104947	0.8545646850314921	0.9205408444262342	0.7430512544427187	104947
100	96866081	104948	0.8390536265579144	0.917254259252201	0.728074856119221	104948
100	96971028	104947	0.8543836412665441	0.9216175783967145	0.7427939817241084	104947
100	96971028	104948	0.8409116895986584	0.9176449289171781	0.7265598201013835	104948
100	97075975	104947	0.8534879510610117	0.921093504340286	0.7428130389625239	104947
100	97075975	104948	0.8389202271601174	0.9171494454396463	0.7273030453176811	104948
100	97180922	104947	0.8540406109750636	0.9214460632509743	0.7422984935253032	104947
100	97180922	104948	0.8390917406715707	0.9174638868773106	0.72741738765865	104948
100	97285869	104947	0.8536499375875441	0.9204646154725719	0.7429369110122253	104947
100	97285869	104948	0.8392441971261958	0.9167016046041849	0.7279223996645958	104948
100	97390816	104947	0.8534974796802195	0.9215127635854289	0.7427368100088616	104947
100	97390816	104948	0.8381770019438198	0.9177402142013188	0.7274840873575485	104948
100	97495763	104947	0.8536499375875441	0.9210553898634549	0.7424700086710435	104947
100	97495763	104948	0.8388535274612189	0.9174638868773106	0.7285131684262682	104948
100	97600710	104947	0.8538214527332845	0.920864817479299	0.7428320962009395	104947
100	97600710	104948	0.8398540229446965	0.9169017037008804	0.7281415558181195	104948
100	97705657	104947	0.8529162339085443	0.9214460632509743	0.7409644868362125	104947
100	97705657	104948	0.8386724854213515	0.9172733163090292	0.7260643366238518	104948
100	97810604	104947	0.85465997122357	0.9216271070159223	0.7418506484225371	104947
100	97810604	104948	0.8391870259557114	0.9175020009909669	0.7274364447154782	104948
100	97915551	104947	0.853611823110713	0.9210363326250393	0.7422984935253032	104947
100	97915551	104948	0.8388821130464611	0.9167587757746694	0.7269600182947745	104948
100	98020498	104947	0.8534117221073494	0.9206933023335588	0.7425367090054981	104947
100	98020498	104948	0.8396062812059305	0.9164824484506613	0.7277223005679003	104948
100	98125445	104947	0.8531353921503235	0.9211125615787016	0.7412122309356152	104947
100	98125445	104948	0.8383866295689294	0.9172066166101307	0.7271124747493997	104948
100	98230392	104947	0.8534403079649728	0.9208552888600913	0.7421936787140175	104947
100	98230392	104948	0.8383771010405153	0.916977931928193	0.7271410603346419	104948
100	98335339	104947	0.8532687928192325	0.9212459622476107	0.7416124329423424	104947
100	98335339	104948	0.8382722872279605	0.9173590730647558	0.7272744597324389	104948
100	98440286	104947	0.852611318093895	0.9211983191515718	0.74235566524055	104947
100	98440286	104948	0.8385772001372108	0.9170732172123337	0.7267313336128368	104948
100	98545233	104947	0.8531353921503235	0.9221511810723508	0.7416600760383812	104947
100	98545233	104948	0.8375862331821473	0.9173495445363418	0.7265026489308991	104948
100	98650180	104947	0.8533926648689338	0.9203026289460394	0.7419649918530306	104947
100	98650180	104948	0.8393871250524069	0.9172733163090292	0.7277508861531425	104948
100	98755127	104947	0.8531544493887391	0.9214937063470132	0.7420412208066929	104947
100	98755127	104948	0.8377577466936006	0.9173304874795136	0.7266074627434539	104948
100	98860074	104947	0.854202597501596	0.9205122585686109	0.7426510524359915	104947
100	98860074	104948	0.8394538247513054	0.9162632922971377	0.7280462705339787	104948
100	98965021	104947	0.8534307793457651	0.921503234966221	0.7409644868362125	104947
100	98965021	104948	0.8391107977283988	0.9171970880817166	0.7264168921751725	104948
100	99069968	104947	0.8520396009414276	0.9214079487741431	0.7419649918530306	104947
100	99069968	104948	0.8371288638182719	0.9178259709570454	0.7278271143804551	104948
100	99174915	104947	0.8538786244485311	0.921093504340286	0.7418887628993682	104947
100	99174915	104948	0.8388535274612189	0.9168254754735679	0.7272363456187826	104948
100	99279862	104947	0.8536022944915053	0.9220368376418573	0.7423461366213422	104947
100	99279862	104948	0.8377291611083584	0.9174543583488965	0.7281510843465335	104948
100	99384809	104947	0.8532878500576482	0.9210744471018705	0.7416505474191735	104947
100	99384809	104948	0.8389488127453596	0.9167778328314975	0.726616991271868	104948
100	99489756	104947	0.8527542473820119	0.9217033359695846	0.741764890849667	104947
100	99489756	104948	0.8382341731143043	0.9179117277127721	0.7269981324084308	104948
100	99594703	104947	0.8522301733255834	0.920598016141481	0.7416124329423424	104947
100	99594703	104948	0.8384914433814842	0.9166253763768724	0.7268647330106338	104948
100	99699650	104947	0.8528590621932975	0.9215794639198833	0.7415552612270956	104947
100	99699650	104948	0.837671989937874	0.9170065175134352	0.7275889011701033	104948
100	99804597	104947	0.8540024964982325	0.9211697332939484	0.7426796382936148	104947
100	99804597	104948	0.8380912451880932	0.9166444334337005	0.7273983306018219	104948
100	99909544	104947	0.8536308803491286	0.9212459622476107	0.7412979885084853	104947
100	99909544	104948	0.8384247436825857	0.9168921751724664	0.7266741624423524	104948
100	100014491	104947	0.8538690958293234	0.9219701373074027	0.7423461366213422	104947
100	100014491	104948	0.838729656591836	0.917254259252201	0.7281987269886039	104948
100	100119438	104947	0.8541263685479337	0.9216366356351301	0.7411455306011606	104947
100	100119438	104948	0.8385105004383123	0.9179784274116706	0.7273983306018219	104948
100	100224385	104947	0.8541835402631804	0.9209696322905848	0.7427368100088616	104947
100	100224385	104948	0.8381198307733354	0.9157963944048482	0.7284274116705416	104948
100	100329332	104947	0.8532687928192325	0.9213031339628575	0.742174621475602	104947
100	100329332	104948	0.8379959599039525	0.9171208598544041	0.7273983306018219	104948
100	100434279	104947	0.8542788264552583	0.9203502720420784	0.7414599750350177	104947
100	100434279	104948	0.8385772001372108	0.9167587757746694	0.7264645348172428	104948
100	100539226	104947	0.8543741126473363	0.920598016141481	0.741993577710654	104947
100	100539226	104948	0.8389488127453596	0.9170636886839196	0.7281987269886039	104948
100	100644173	104947	0.8538214527332845	0.9216652214927535	0.7415457326078878	104947
100	100644173	104948	0.837929260205054	0.9174067157068262	0.7264264207035865	104948
100	100749120	104947	0.8534593652033884	0.9208552888600913	0.7428225675817317	104947
100	100749120	104948	0.8386153142508671	0.9163871631665206	0.7273411594313374	104948
100	100854067	104947	0.8544026985049596	0.9215222922046367	0.7431465406347966	104947
100	100854067	104948	0.8382722872279605	0.9177402142013188	0.7278271143804551	104948
100	100959014	104947	0.8540977826903103	0.9211697332939484	0.7417744194688748	104947
100	100959014	104948	0.8385200289667264	0.9167778328314975	0.7266074627434539	104948
100	101063961	104947	0.8536404089683364	0.9212078477707796	0.7417363049920436	104947
100	101063961	104948	0.837271791744483	0.9170160460418493	0.7258832945839845	104948
100	101168908	104947	0.8538309813524922	0.9211697332939484	0.7423080221445111	104947
100	101168908	104948	0.8367763082669513	0.9172352021953729	0.7272268170903685	104948
100	101273855	104947	0.8535927658722975	0.9216652214927535	0.7424700086710435	104947
100	101273855	104948	0.838338986926859	0.9177211571444907	0.7273697450165797	104948
100	101378802	104947	0.8534212507265572	0.9212745481052341	0.7416886618960047	104947
100	101378802	104948	0.8368334794374357	0.9171685024964744	0.7270934176925715	104948
100	101483749	104947	0.8541835402631804	0.9210363326250393	0.7422032073332253	104947
100	101483749	104948	0.8374909478980066	0.9176163433319359	0.7276174867553455	104948
100	101588696	104947	0.8549458297998037	0.9215127635854289	0.7423175507637189	104947
100	101588696	104948	0.8376910469947021	0.9166634904905286	0.7265217059877273	104948
100	101693643	104947	0.8533164359152715	0.9202263999923771	0.7411931736971995	104947
100	101693643	104948	0.8376148187673895	0.9166730190189427	0.7268552044822197	104948
100	101798590	104947	0.8532402069616092	0.9207028309527666	0.7417267763728358	104947
100	101798590	104948	0.8377958608072569	0.916844532530396	0.7262739642489614	104948
100	101903537	104947	0.8537738096372455	0.9208362316216757	0.7415266753694723	104947
100	101903537	104948	0.8375004764264207	0.9173400160079277	0.7263883065899303	104948
100	102008484	104947	0.8543359981705051	0.9208838747177146	0.7413742174621476	104947
100	102008484	104948	0.8377196325799443	0.9167206616610131	0.7268647330106338	104948
100	102113431	104947	0.8533450217728948	0.9212554908668185	0.7419268773761994	104947
100	102113431	104948	0.8373099058581392	0.9167683043030834	0.7271696459198841	104948
100	102218378	104947	0.8541835402631804	0.9211220901979095	0.7419364059954072	104947
100	102218378	104948	0.838196059000648	0.9169112322292945	0.7273411594313374	104948
100	102323325	104947	0.8531925638655702	0.9218843797345326	0.7413551602237319	104947
100	102323325	104948	0.8370526355909593	0.9162632922971377	0.7270076609368449	104948
100	102428272	104947	0.8546885570811934	0.9212936053436497	0.7411074161243294	104947
100	102428272	104948	0.8388821130464611	0.9169684033997789	0.7269123756527042	104948
100	102533219	104947	0.8538786244485311	0.9209886895290004	0.7418506484225371	104947
100	102533219	104948	0.8371479208751	0.9173019018942714	0.7262072645500629	104948
100	102638166	104947	0.8539072103061546	0.9215127635854289	0.7421460356179786	104947
100	102638166	104948	0.8365952662270839	0.9171589739680603	0.7267313336128368	104948
100	102743113	104947	0.8538309813524922	0.9214079487741431	0.7409549582170047	104947
100	102743113	104948	0.8368620650226779	0.9169588748713648	0.7258070663566719	104948
100	102848060	104947	0.8533164359152715	0.9221511810723508	0.7417744194688748	104947
100	102848060	104948	0.8373480199717955	0.9184167397187178	0.7266360483286961	104948
100	102953007	104947	0.854974415657427	0.9210077467674159	0.7415171467502644	104947
100	102953007	104948	0.8375290620116629	0.9176068148035218	0.7270934176925715	104948
100	103057954	104947	0.8539262675445701	0.9207885885256367	0.7430798403003421	104947
100	103057954	104948	0.837281320272897	0.9171018027975759	0.7279605137782521	104948
100	103162901	104947	0.8541263685479337	0.9209219891945458	0.7420602780451084	104947
100	103162901	104948	0.8368430079658498	0.9163871631665206	0.7269981324084308	104948
100	103267848	104947	0.8548600722269336	0.9210839757210783	0.7408977865017581	104947
100	103267848	104948	0.8385200289667264	0.9171303883828181	0.727026717993673	104948
100	103372795	104947	0.8544693988394142	0.9207790599064289	0.7414695036542255	104947
100	103372795	104948	0.8378625605061555	0.9163490490528643	0.726759919198079	104948
100	103477742	104947	0.8534307793457651	0.9209505750521692	0.7429750254890564	104947
100	103477742	104948	0.8375195334832488	0.9173495445363418	0.727550787056447	104948
100	103582689	104947	0.8540120251174402	0.9209219891945458	0.7410216585514593	104947
100	103582689	104948	0.8378435034493272	0.9171399169112322	0.7273697450165797	104948
100	103687636	104947	0.8545360991738687	0.9205027299494031	0.7422413218100565	104947
100	103687636	104948	0.8375767046537332	0.9165205625643176	0.7269981324084308	104948
100	103792583	104947	0.8538976816869468	0.919683268697533	0.7418601770417449	104947
100	103792583	104948	0.8383199298700309	0.9159964935015437	0.7261500933795785	104948
100	103897530	104947	0.8545837422699076	0.9205884875222732	0.7412789312700696	104947
100	103897530	104948	0.837814917864085	0.9163490490528643	0.7274269161870641	104948
100	104002477	104947	0.8543836412665441	0.919959598654559	0.7419649918530306	104947
100	104002477	104948	0.8380150169607806	0.9160536646720281	0.7264168921751725	104948
100	104107424	104947	0.8540596682134792	0.9208362316216757	0.7414980895118488	104947
100	104107424	104948	0.8381198307733354	0.9177021000876625	0.7268361474253916	104948
100	104212371	104947	0.8533069072960637	0.9205408444262342	0.7402974834916672	104947
100	104212371	104948	0.837014521477303	0.9164919769790754	0.7267694477264931	104948
100	104317318	104947	0.8548314863693103	0.9202740430884161	0.7413551602237319	104947
100	104317318	104948	0.8381770019438198	0.9167301901894271	0.7272268170903685	104948
100	104422265	104947	0.8540120251174402	0.9210744471018705	0.7413361029853164	104947
100	104422265	104948	0.8379006746198118	0.917387658649998	0.7263883065899303	104948
100	104527212	104947	0.854478927458622	0.9196165683630785	0.7415457326078878	104947
100	104527212	104948	0.8382341731143043	0.9156725235354651	0.7269886038800167	104948
100	104632159	104947	0.8535070082994274	0.9208838747177146	0.7416505474191735	104947
100	104632159	104948	0.8377101040515302	0.9166349049052864	0.7279890993634943	104948
100	104737106	104947	0.8539072103061546	0.9210268040058315	0.74153620398868	104947
100	104737106	104948	0.8372146205739985	0.9169684033997789	0.7270934176925715	104948
100	104842053	104947	0.8536308803491286	0.9200548848466369	0.7421555642371864	104947
100	104842053	104948	0.8380340740176088	0.9160631932004422	0.7272839882608529	104948
100	104947000	104947	0.8543169409320895	0.9217986221616625	0.7407834430712645	104947
100	104947000	104948	0.8379006746198118	0.9169969889850211	0.7255783816747342	104948
In [92]:
cor_array = []
for x,y1,y2,y3 in zip(XTrain,y1_train, y2_train, y3_train):
    inputs = np.asfarray(x)
    targets = np.zeros(output_nodes) + 0.01
    targets[y1] = .99
    targets[y1_n+y2] = .99
    targets[y1_n+y2_n+y3] = .99
    cor_array.append(sc.stats.spearmanr(np.asfarray([x[0] for x in n.back_query(targets)]),inputs).correlation)
    pass
import matplotlib.pyplot as plt
plt.hist(cor_array, bins='auto')  # arguments are passed to np.histogram
plt.title("Histogram with 'auto' bins")
plt.show()
In [52]:
n.create_word_cloud(plot_cloud=True, cuttoff=0.01, abs_pass=False, 
                    country='US', category='Red', taster='Paul Gregutt',
                  vectorizer=vectorizer,scaler= scaler, svd=svd1 ,
                    map_mask_path="wine2_removed.png")
Out[52]:
{'2020': 12,
 'accented': 10,
 'acidity': 44,
 'all': 15,
 'almond': 17,
 'almost': 13,
 'although': 14,
 'ample': 13,
 'and': 108,
 'apple': 33,
 'aromas': 31,
 'balance': 11,
 'barrel': 10,
 'bitter': 15,
 'blend': 11,
 'body': 15,
 'bright': 17,
 'brisk': 13,
 'cabernet': 19,
 'candied': 18,
 'candy': 11,
 'cassis': 17,
 'character': 16,
 'chardonnay': 16,
 'cherries': 15,
 'cherry': 36,
 'citrusy': 12,
 'cocoa': 10,
 'coffee': 23,
 'cola': 11,
 'color': 16,
 'cranberry': 14,
 'creamy': 28,
 'crisp': 26,
 'delicious': 17,
 'depth': 16,
 'does': 10,
 'drink': 14,
 'drinking': 37,
 'dry': 47,
 'easy': 22,
 'enough': 18,
 'estate': 22,
 'finish': 18,
 'finishes': 25,
 'first': 27,
 'flavors': 36,
 'flowers': 15,
 'food': 10,
 'for': 50,
 'franc': 19,
 'from': 26,
 'fruit': 27,
 'fruits': 10,
 'good': 22,
 'grape': 15,
 'grapefruit': 12,
 'grapes': 18,
 'green': 19,
 'grown': 14,
 'hard': 23,
 'high': 16,
 'in': 46,
 'is': 58,
 'it': 23,
 'jammy': 12,
 'juicy': 21,
 'layered': 14,
 'lean': 11,
 'lemon': 17,
 'light': 22,
 'lingering': 11,
 'lively': 28,
 'long': 14,
 'lovely': 14,
 'lush': 12,
 'made': 13,
 'merlot': 27,
 'much': 11,
 'noir': 29,
 'notes': 28,
 'now': 31,
 'oaky': 19,
 'of': 51,
 'offering': 16,
 'peach': 33,
 'peaches': 10,
 'pear': 18,
 'pepper': 12,
 'peppery': 15,
 'petit': 15,
 'pie': 11,
 'pineapple': 12,
 'pink': 10,
 'pinot': 46,
 'pleasant': 20,
 'plenty': 19,
 'pretty': 16,
 'price': 11,
 'provide': 11,
 'quality': 11,
 'quite': 17,
 're': 10,
 'red': 22,
 'rich': 11,
 'richness': 11,
 'right': 24,
 'ripe': 12,
 'rosã': 11,
 'rounded': 14,
 'short': 21,
 'show': 11,
 'silky': 20,
 'smells': 10,
 'smoky': 14,
 'so': 24,
 'some': 26,
 'sour': 24,
 'straightforward': 11,
 'structure': 25,
 'sugary': 14,
 'sweet': 31,
 'tannins': 21,
 'tastes': 19,
 'texture': 22,
 'that': 14,
 'the': 56,
 'then': 15,
 'they': 11,
 'this': 25,
 'to': 16,
 'toast': 15,
 'toasted': 13,
 'tropical': 11,
 'two': 12,
 'up': 40,
 'vanilla': 12,
 'verdot': 14,
 'vintage': 11,
 'weight': 13,
 'white': 22,
 'wine': 16,
 'winery': 10,
 'with': 85,
 'year': 12,
 'years': 15,
 'yellow': 19}

Website to dynamically predict and produce backwards neural network word clouds:

https://arithmeticr.pythonanywhere.com/

Project Schedule.

Make sure that you plan your work so that you can avoid a big rush right before the final project deadline, and delegate different modules and responsibilities among your team members. Write this in terms of weekly deadlines.

DateItemAssignedDetails
Finish March 27, 2018Acquire Data Brian Lead Pulling data from https://www.winemag.com using python.
Finish March 29, 2018Clean Data Li Lead Start on prelim data.
Finish March 29, 2018Explore Data Trevor Lead Start on prelim data
Start March 29, 2018Start Analysis Team Run prelim models on full data before milestone.
Due Apr 1, 2018Project Milestone Due Team

For your milestone, we expect you to have acquired, cleaned, and explored your dataset. You should also explain in more detail what will go into your final analysis. Explain deviations from your initial project plan.

Finish Apr 6, 2018Finish Analysis Team
What are the key drivers for wine points and wine price? Penalised linear models (elastic net) and trees.
Which wines are systematically over or under valued in terms of price? Penalised linear models, random forests, gradient boosting machines, k-NNs, SVMs, and Neural Networks.
What attributes per wine should you use to market them? PCA, FA, SEMs, and Bayesian Networks.
How similar are different wines varieties? Clustering and profiling.
Finish Apr 6, 2018Document findings Team Update Jupyter Notebook.
Finish Apr 13, 2018Create Prediction Tool Team Create Html tool to output results and visualization dynamically
?Project Review with the staffTeam
Finish Apr 13, 2018Develop screen-cast Team

You must also include a three minute video including audio walking us through your project. Each team will create a three minute screen-cast with narration showing a demo of your project and/or some slides. You can use any screencast tool of your choice. Please make sure that the sound quality of your video is good. Upload the video to an online video-platform such as YouTube or Vimeo and link to it from your notebook.

We will strictly enforce the three minute time limit for the video, so please make sure you are not running longer. Use principles of good storytelling and presentations to get your key points across. Focus the majority of your screencast on your main contributions rather than on technical details. What do you feel is the best part of your project? What insights did you gain? What is the single most important thing you would like your audience to take away? Make sure it is front and center rather than at the end.

Finish Apr 20, 2018 Complete peer evaluations Team

It is important to provide positive feedback to people who truly worked hard for the good of the team and to also make suggestions to those you perceived not to be working as effectively on team tasks. We ask you to provide an honest assessment of the contributions of the members of your team, including yourself. The feedback you provide should reflect your judgment of each team member’s:

  • Preparation – were they prepared during team meetings?
  • Contribution – did they contribute productively to the team discussion and work?
  • Respect for others’ ideas – did they encourage others to contribute their ideas?
  • Flexibility – were they flexible when disagreements occurred?

Your teammate’s assessment of your contributions and the accuracy of your self-assessment will be considered as part of your overall project score.

Due Apr 22, 2018Final Project Due Team

For your final project you must complete the analysis in your notebook and present your results in a compelling way.

?Project Presentation Team

Each team will be given a brief slot (~5 minutes) to present their project in one of the two last lectures. Present your analysis questions and your main contributions, but also explain your methods and justify your choices. What do you feel is the best part of your project? What insights did you gain? What is the single most important thing you would like your audience to take away?

Peer Review

Group Name: Weather our power consumption will change?

Reviewers: Brian Tillman and Trevor Olsen

Group Member Names: Aaron Young

Objective: Identifying how much power is being used depending on the weather forecast. Witch buildings use the most? Model will not handle real time data but will predict daily based on weather.

Dataset: Collecting weather data from different sources, and power consumption of University of Utah buildings.

Data Processing: Data must be collected in the middle of the month because of pay system first week of the month. Cleaning the data is going to be time intensive, the project is going to depend on how fast they can collect the data.

Exploratory Analysis: Use the client tool provided by the university to do introductory information and explore the variables that are affected the most by weather. Built in visualization tools will help with analysis.

Analysis Methods: None linear auto regressive with exogenous Data Visualization for weather factors on power consumption. Focus on the power usage not cost.

Must have Features: Want to have good model to correlate with weather data. Pick what building the user is looking at. Overall energy consumption on campus (Sum all equipment on campus).

Optional Features: Explore the cost model Heat map the data on to the university of Utah campus map. Add energy usage to the campus map Separate models for bigger buildings

Schedule: Data Collected over spring break

General Questions:

• Very interesting and unique data set. Operational data collection and presenting to the university facility board. • The model will be daily not real time data. Pulling slices of data vs the model.

Data Acquisition and Clean Up:

• The data acquisition is going to need a bit of clean up because the data is large and trouble accessing. The project will solve a solution for other building managers. • Go to browsers-based API and then scrap for unique identifiers. • Hundreds of buildings and each building has at least 4 sections in the building each contacting 4 parts. No hospital data and other buildings are restricted. • Gathering weather data from multiple sources. • How are you going to store the data? Pull the data and then pull data for a training set, then use the data from one data to

Analysis Methodology:

• Equipment clustering could be used to visualize the coalition to each other and weather. In a moment of time the equipment’s will have a different distance. • Random effects model with in the time frame. • Heat map the data on to the university of Utah campus map could be more important for the project. • Equipment is going to vary, and you can find meters that are not correlated with each other.